Apr 20 14:51:12.629355 ip-10-0-140-93 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 14:51:12.629370 ip-10-0-140-93 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 14:51:12.629378 ip-10-0-140-93 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 14:51:12.629674 ip-10-0-140-93 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 14:51:22.660489 ip-10-0-140-93 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 14:51:22.660509 ip-10-0-140-93 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot a3ec68779cd043fd90befb12809c99f7 --
Apr 20 14:53:48.194336 ip-10-0-140-93 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:53:48.623385 ip-10-0-140-93 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:48.623385 ip-10-0-140-93 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:53:48.623385 ip-10-0-140-93 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:48.623385 ip-10-0-140-93 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:53:48.623385 ip-10-0-140-93 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:53:48.625900 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.625817 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:53:48.628807 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628793 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:48.628807 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628808 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628811 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628815 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628818 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628821 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628824 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628827 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628829 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628832 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628835 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628838 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628841 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628844 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628846 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628851 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628855 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628858 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628861 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628863 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:48.628868 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628866 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628869 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628872 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628875 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628878 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628880 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628883 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628886 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628889 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628891 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628894 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628897 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628899 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628901 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628904 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628906 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628909 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628911 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628914 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628916 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:48.629340 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628919 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628921 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628924 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628926 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628932 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628936 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628939 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628942 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628944 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628947 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628949 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628952 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628954 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628957 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628960 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628963 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628965 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628968 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628970 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:48.629859 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628973 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628975 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628978 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628981 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628983 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628986 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628988 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628991 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628993 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628996 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.628998 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629001 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629003 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629006 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629009 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629011 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629014 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629028 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629032 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629035 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:48.630341 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629039 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629041 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629044 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629047 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629049 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629052 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629055 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629420 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629425 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629428 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629431 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629434 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629436 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629439 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629442 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629444 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629447 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629449 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629452 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629455 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:48.630828 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629458 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629460 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629462 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629465 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629468 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629470 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629473 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629475 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629477 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629480 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629483 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629485 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629488 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629490 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629493 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629495 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629498 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629500 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629503 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629507 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:48.631324 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629509 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629512 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629515 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629517 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629519 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629522 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629525 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629527 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629529 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629532 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629534 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629537 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629540 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629542 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629545 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629547 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629550 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629553 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629555 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629558 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:48.631810 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629560 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629565 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629569 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629572 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629574 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629577 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629579 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629582 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629584 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629586 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629589 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629592 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629595 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629597 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629600 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629602 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629604 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629607 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629610 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629612 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:48.632321 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629615 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629617 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629620 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629623 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629626 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629628 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629631 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629633 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629636 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629638 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629641 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629643 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.629645 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630198 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630207 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630215 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630220 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630224 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630227 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630232 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630236 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 14:53:48.632854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630239 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630242 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630247 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630251 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630254 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630257 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630260 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630262 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630265 2575 flags.go:64] FLAG: --cloud-config=""
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630268 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630271 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630275 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630278 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630281 2575 flags.go:64] FLAG: --config-dir=""
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630284 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630287 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630291 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630294 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630297 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630301 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630304 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630307 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630310 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630313 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630316 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 14:53:48.633374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630320 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630323 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630326 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630330 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630334 2575 flags.go:64] FLAG: --enable-server="true"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630336 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630341 2575 flags.go:64] FLAG: --event-burst="100"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630345 2575 flags.go:64] FLAG: --event-qps="50"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630348 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630351 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630355 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630363 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630366 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630370 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 14:53:48.633973 ip-10-0-140-93
kubenswrapper[2575]: I0420 14:53:48.630373 2575 flags.go:64] FLAG: --eviction-soft="" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630375 2575 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630378 2575 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630381 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630384 2575 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630387 2575 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630390 2575 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630392 2575 flags.go:64] FLAG: --feature-gates="" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630396 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630399 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630402 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 14:53:48.633973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630405 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630408 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630411 2575 flags.go:64] FLAG: --help="false" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630414 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-140-93.ec2.internal" Apr 
20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630417 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630420 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630423 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630426 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630430 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630433 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630436 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630439 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630442 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630445 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630448 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630451 2575 flags.go:64] FLAG: --kube-reserved="" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630454 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630457 2575 flags.go:64] FLAG: 
--kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630460 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630462 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630465 2575 flags.go:64] FLAG: --lock-file="" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630468 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630470 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630473 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 14:53:48.634596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630479 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630481 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630484 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630487 2575 flags.go:64] FLAG: --logging-format="text" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630489 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630493 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630495 2575 flags.go:64] FLAG: --manifest-url="" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630498 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: 
I0420 14:53:48.630503 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630506 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630510 2575 flags.go:64] FLAG: --max-pods="110" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630512 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630515 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630518 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630521 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630524 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630527 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630530 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630537 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630540 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630543 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630547 2575 flags.go:64] FLAG: --pod-cidr="" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630549 2575 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 14:53:48.635181 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630554 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630557 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630560 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630563 2575 flags.go:64] FLAG: --port="10250" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630566 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630569 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ef0aa70e62782c82" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630572 2575 flags.go:64] FLAG: --qos-reserved="" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630575 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630578 2575 flags.go:64] FLAG: --register-node="true" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630581 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630584 2575 flags.go:64] FLAG: --register-with-taints="" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630588 2575 flags.go:64] FLAG: --registry-burst="10" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630591 2575 flags.go:64] FLAG: --registry-qps="5" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630593 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 20 14:53:48.635717 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630596 2575 flags.go:64] FLAG: --reserved-memory="" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630600 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630603 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630605 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630608 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630611 2575 flags.go:64] FLAG: --runonce="false" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630614 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630617 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630620 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630623 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630626 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630629 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 14:53:48.635717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630632 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630634 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630637 2575 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630640 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630643 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630648 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630651 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630654 2575 flags.go:64] FLAG: --system-cgroups="" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630657 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630662 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630665 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630668 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630672 2575 flags.go:64] FLAG: --tls-min-version="" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630674 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630677 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630680 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630683 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 14:53:48.636386 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:53:48.630686 2575 flags.go:64] FLAG: --v="2" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630690 2575 flags.go:64] FLAG: --version="false" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630695 2575 flags.go:64] FLAG: --vmodule="" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630699 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.630702 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630795 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630799 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:53:48.636386 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630802 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630805 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630808 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630810 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630813 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630816 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630818 
2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630821 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630823 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630826 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630828 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630831 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630834 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630838 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630841 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630844 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630847 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630850 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630853 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630855 
2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:53:48.637406 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630858 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630860 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630863 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630865 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630868 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630870 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630873 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630876 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630879 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630881 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630884 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630886 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:53:48.637977 ip-10-0-140-93 
kubenswrapper[2575]: W0420 14:53:48.630889 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630892 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630894 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630897 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630899 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630902 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630904 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630907 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:53:48.637977 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630909 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630911 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630914 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630917 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630919 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:53:48.638475 ip-10-0-140-93 
kubenswrapper[2575]: W0420 14:53:48.630926 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630929 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630932 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630937 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630940 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630943 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630945 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630948 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630950 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630953 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630955 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630958 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630961 2575 feature_gate.go:328] unrecognized 
feature gate: ExternalOIDC Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630964 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630968 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:53:48.638475 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630971 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630973 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630976 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630978 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630981 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630984 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630986 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630989 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630991 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630994 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 
14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630996 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.630999 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631001 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631003 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631006 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631009 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631011 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631015 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631031 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:53:48.638956 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631040 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631045 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631049 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631052 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.631055 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.631060 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.638475 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.638490 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638543 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638548 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638553 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638557 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638560 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638563 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638566 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:48.639446 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638568 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638571 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638574 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638577 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638580 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638583 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638585 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638588 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638590 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638593 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638595 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638598 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638600 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638602 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638605 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638608 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638610 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638613 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638615 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638617 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:48.639825 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638620 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638622 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638625 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638628 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638631 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638634 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638637 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638639 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638641 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638644 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638646 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638649 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638651 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638654 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638658 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638662 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638665 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638669 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638672 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:48.640344 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638674 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638677 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638680 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638682 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638685 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638688 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638690 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638693 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638696 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638698 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638701 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638703 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638705 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638708 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638710 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638713 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638715 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638719 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638722 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638724 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638727 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:48.640819 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638729 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638732 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638735 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638737 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638740 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638743 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638745 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638748 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638750 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638753 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638756 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638758 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638761 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638763 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638765 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638768 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638770 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638773 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:48.641323 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638776 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.638781 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638877 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638882 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638886 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638889 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638892 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638895 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638897 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638900 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638903 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638906 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638908 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638911 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638914 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:53:48.641764 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638916 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638918 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638921 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638923 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638926 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638929 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638931 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638933 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638936 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638940 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638942 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638946 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638949 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638952 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638955 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638958 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638960 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638963 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638965 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638968 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:53:48.642182 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638971 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638973 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638976 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638978 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638981 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638983 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638985 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638988 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638991 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638993 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.638997 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639001 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639003 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639006 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639008 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639011 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639013 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639034 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639038 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639040 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:53:48.642663 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639043 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639046 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639050 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639052 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639054 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639057 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639060 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639062 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639064 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639067 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639069 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639072 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639074 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639077 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639079 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639082 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639084 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639087 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639089 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639092 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:53:48.643249 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639094 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639097 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639100 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639102 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639105 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639107 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639110 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639112 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639115 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639117 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639120 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639122 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:48.639125 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.639129 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:53:48.643729 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.639853 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 14:53:48.644164 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.644150 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 14:53:48.645058 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.645047 2575 server.go:1019] "Starting client certificate rotation"
Apr 20 14:53:48.645170 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.645153 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:53:48.645202 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.645194 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:53:48.670484 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.670466 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:53:48.673086 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.673067 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:53:48.686292 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.686272 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 20 14:53:48.691955 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.691940 2575 log.go:25] "Validated CRI v1 image API"
Apr 20 14:53:48.693164 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.693149 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 14:53:48.697373 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.697348 2575 fs.go:135] Filesystem UUIDs: map[7519464a-579a-4d95-9cc4-c9f62b6b9648:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 fe10d8f4-d5ba-4c19-82c3-1fac59449a87:/dev/nvme0n1p4]
Apr 20 14:53:48.697442 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.697374 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 14:53:48.704206 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.703867 2575 manager.go:217] Machine: {Timestamp:2026-04-20 14:53:48.701941058 +0000 UTC m=+0.394484547 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100456 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2505e6c1551999fca6d65ca6bbef3d SystemUUID:ec2505e6-c155-1999-fca6-d65ca6bbef3d BootID:a3ec6877-9cd0-43fd-90be-fb12809c99f7 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:69:10:dd:7d:c5 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:69:10:dd:7d:c5 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9a:7c:0a:49:ea:2d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 14:53:48.704206 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.704194 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 14:53:48.704349 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.704299 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 14:53:48.704705 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.704685 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:53:48.705393 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.705362 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 14:53:48.705530 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.705397 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-93.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 14:53:48.705572 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.705540 2575 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 14:53:48.705572 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.705550 2575 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 14:53:48.705572 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.705562 2575
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 14:53:48.706326 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.706315 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 14:53:48.707629 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.707619 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 20 14:53:48.707749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.707739 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 14:53:48.710252 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.710242 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 20 14:53:48.710286 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.710256 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 14:53:48.710286 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.710267 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 14:53:48.710286 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.710275 2575 kubelet.go:397] "Adding apiserver pod source" Apr 20 14:53:48.710286 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.710283 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 14:53:48.711494 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.711481 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 14:53:48.711543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.711499 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 14:53:48.714788 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.714770 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 14:53:48.715908 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:53:48.715895 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 14:53:48.717231 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717215 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 14:53:48.717267 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717245 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 14:53:48.717267 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717257 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 14:53:48.717330 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717267 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 14:53:48.717330 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717280 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 14:53:48.717330 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717291 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 14:53:48.717330 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717301 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 14:53:48.717330 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717313 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 14:53:48.717330 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717324 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 14:53:48.717475 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717335 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 14:53:48.717475 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717353 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 14:53:48.717475 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.717370 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 14:53:48.718413 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.718401 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 14:53:48.718448 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.718416 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 14:53:48.722108 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.722096 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 14:53:48.722162 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.722129 2575 server.go:1295] "Started kubelet" Apr 20 14:53:48.722285 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.722228 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 14:53:48.722373 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.722300 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 14:53:48.722373 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.722292 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 14:53:48.722810 ip-10-0-140-93 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 14:53:48.722960 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.722837 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-93.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 14:53:48.722960 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.722872 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 14:53:48.722960 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.722944 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-93.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 14:53:48.723295 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.723218 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 14:53:48.725305 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.725287 2575 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 14:53:48.728807 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.728786 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 14:53:48.729219 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.729200 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 14:53:48.729967 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.729951 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 14:53:48.729967 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.729967 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 14:53:48.730154 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.730077 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 14:53:48.730154 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.730127 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 14:53:48.730154 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.730131 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 14:53:48.730339 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.730152 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:48.730339 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.730179 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pbpcd"
Apr 20 14:53:48.730977 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.730827 2575 factory.go:153] Registering CRI-O factory
Apr 20 14:53:48.731072 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.730987 2575 factory.go:223] Registration of the crio container factory successfully
Apr 20 14:53:48.731072 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.730956 2575 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 14:53:48.731188 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.731071 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-140-93.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 14:53:48.731188 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.731080 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 14:53:48.731188 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.731101 2575 factory.go:55] Registering systemd factory
Apr 20 14:53:48.731188 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.731112 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 20 14:53:48.731188 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.731136 2575 factory.go:103] Registering Raw factory
Apr 20 14:53:48.731188 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.731149 2575 manager.go:1196] Started watching for new ooms in manager
Apr 20 14:53:48.731636 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.731619 2575 manager.go:319] Starting recovery of all containers
Apr 20 14:53:48.731875 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.730896 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-93.ec2.internal.18a8185a358b5870 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-93.ec2.internal,UID:ip-10-0-140-93.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-93.ec2.internal,},FirstTimestamp:2026-04-20 14:53:48.722108528 +0000 UTC m=+0.414652018,LastTimestamp:2026-04-20 14:53:48.722108528 +0000 UTC m=+0.414652018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-93.ec2.internal,}"
Apr 20 14:53:48.735042 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.734995 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 14:53:48.736402 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.736370 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pbpcd"
Apr 20 14:53:48.742991 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.742978 2575 manager.go:324] Recovery completed
Apr 20 14:53:48.747149 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.747136 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:48.749518 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.749500 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:48.749595 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.749530 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:48.749595 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.749541 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:48.749969 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.749957 2575 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 14:53:48.750010 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.749971 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 14:53:48.750010 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.749990 2575 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:53:48.752030 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.752005 2575 policy_none.go:49] "None policy: Start"
Apr 20 14:53:48.752083 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.752039 2575 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 14:53:48.752083 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.752058 2575 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 14:53:48.791631 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.791615 2575 manager.go:341] "Starting Device Plugin manager"
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.791649 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.791661 2575 server.go:85] "Starting device plugin registration server"
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.791895 2575 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.791907 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.792007 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.792099 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.792105 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.792617 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 14:53:48.800345 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.792656 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:48.862153 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.862119 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 14:53:48.863567 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.863536 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 14:53:48.863567 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.863563 2575 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 14:53:48.863694 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.863583 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 14:53:48.863694 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.863590 2575 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 14:53:48.863694 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.863620 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 14:53:48.865533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.865511 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:53:48.892634 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.892588 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:48.893689 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.893669 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:48.893775 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.893699 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:48.893775 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.893710 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:48.893775 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.893732 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-93.ec2.internal"
Apr 20 14:53:48.901556 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.901540 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-93.ec2.internal"
Apr 20 14:53:48.901635 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.901569 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-93.ec2.internal\": node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:48.920189 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.920163 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:48.964232 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.964213 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal"]
Apr 20 14:53:48.964321 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.964273 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:48.965537 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.965523 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:48.965630 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.965553 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:48.965630 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.965567 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:48.966682 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.966667 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:48.966816 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.966803 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:48.966859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.966832 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:48.967318 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.967305 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:48.967402 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.967316 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:48.967402 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.967333 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:48.967402 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.967340 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:48.967402 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.967347 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:48.967402 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.967351 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:48.968436 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.968415 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:48.968436 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.968439 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:53:48.969054 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.969038 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:53:48.969128 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.969070 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:53:48.969128 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:48.969085 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:53:48.987545 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.987528 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-93.ec2.internal\" not found" node="ip-10-0-140-93.ec2.internal"
Apr 20 14:53:48.992136 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:48.992122 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-93.ec2.internal\" not found" node="ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.020404 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.020386 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.031410 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.031387 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b414fbe4806db5c9685d60534d8eee40-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal\" (UID: \"b414fbe4806db5c9685d60534d8eee40\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.031492 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.031418 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b414fbe4806db5c9685d60534d8eee40-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal\" (UID: \"b414fbe4806db5c9685d60534d8eee40\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.031492 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.031446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b1b2e50dbc39040c838976f73463d02-config\") pod \"kube-apiserver-proxy-ip-10-0-140-93.ec2.internal\" (UID: \"1b1b2e50dbc39040c838976f73463d02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.120738 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.120717 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.132109 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.132093 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b414fbe4806db5c9685d60534d8eee40-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal\" (UID: \"b414fbe4806db5c9685d60534d8eee40\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.132109 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.132100 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/b414fbe4806db5c9685d60534d8eee40-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal\" (UID: \"b414fbe4806db5c9685d60534d8eee40\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.132194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.132125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b414fbe4806db5c9685d60534d8eee40-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal\" (UID: \"b414fbe4806db5c9685d60534d8eee40\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.132194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.132143 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b1b2e50dbc39040c838976f73463d02-config\") pod \"kube-apiserver-proxy-ip-10-0-140-93.ec2.internal\" (UID: \"1b1b2e50dbc39040c838976f73463d02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.132194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.132167 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1b1b2e50dbc39040c838976f73463d02-config\") pod \"kube-apiserver-proxy-ip-10-0-140-93.ec2.internal\" (UID: \"1b1b2e50dbc39040c838976f73463d02\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.132194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.132188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b414fbe4806db5c9685d60534d8eee40-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal\" (UID: \"b414fbe4806db5c9685d60534d8eee40\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.221507 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.221442 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.289989 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.289965 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.294442 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.294426 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal"
Apr 20 14:53:49.322077 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.322052 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.422613 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.422589 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.523201 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.523144 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.542047 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.542010 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:53:49.623713 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.623691 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.645226 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.645205 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 14:53:49.645327 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.645308 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 14:53:49.645394 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.645365 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 14:53:49.723794 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.723770 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.729713 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.729693 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 14:53:49.738004 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.737973 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:48:48 +0000 UTC" deadline="2027-10-05 20:43:22.222333135 +0000 UTC"
Apr 20 14:53:49.738004 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.738001 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12797h49m32.484334696s"
Apr 20 14:53:49.744929 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.744906 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:53:49.764554 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.764530 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4v5tr"
Apr 20 14:53:49.772399 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.772378 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4v5tr"
Apr 20 14:53:49.814385 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.814365 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:53:49.824519 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.824502 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found"
Apr 20 14:53:49.860186 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:49.860158 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b1b2e50dbc39040c838976f73463d02.slice/crio-54813eece40255e1396c1607c3172d62d2421aa058f93a81673ca99ae2875b0c WatchSource:0}: Error finding container 54813eece40255e1396c1607c3172d62d2421aa058f93a81673ca99ae2875b0c: Status 404 returned error can't find the container with id 54813eece40255e1396c1607c3172d62d2421aa058f93a81673ca99ae2875b0c
Apr 20 14:53:49.860418 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:49.860397 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb414fbe4806db5c9685d60534d8eee40.slice/crio-d1455efcee1f51d1d2ed5e97f742d3f5a2cf74b938644dbb374506f715baf6b7 WatchSource:0}: Error finding container d1455efcee1f51d1d2ed5e97f742d3f5a2cf74b938644dbb374506f715baf6b7: Status 404 returned error can't find the container with id d1455efcee1f51d1d2ed5e97f742d3f5a2cf74b938644dbb374506f715baf6b7
Apr 20 14:53:49.864619 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.864596 2575 provider.go:93] Refreshing cache for provider:
*credentialprovider.defaultDockerConfigProvider Apr 20 14:53:49.867883 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.867837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal" event={"ID":"1b1b2e50dbc39040c838976f73463d02","Type":"ContainerStarted","Data":"54813eece40255e1396c1607c3172d62d2421aa058f93a81673ca99ae2875b0c"} Apr 20 14:53:49.868750 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.868729 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal" event={"ID":"b414fbe4806db5c9685d60534d8eee40","Type":"ContainerStarted","Data":"d1455efcee1f51d1d2ed5e97f742d3f5a2cf74b938644dbb374506f715baf6b7"} Apr 20 14:53:49.925566 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:49.925540 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-93.ec2.internal\" not found" Apr 20 14:53:49.930962 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:49.930938 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:50.029812 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.029792 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal" Apr 20 14:53:50.041267 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.041248 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:53:50.042353 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.042341 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal" Apr 20 14:53:50.051127 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.051110 2575 warnings.go:110] "Warning: 
metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:53:50.606898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.606862 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:50.710921 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.710891 2575 apiserver.go:52] "Watching apiserver" Apr 20 14:53:50.717074 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.716872 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 14:53:50.718219 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.718196 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-pjv9g","openshift-ovn-kubernetes/ovnkube-node-4d2sl","kube-system/konnectivity-agent-vc6rt","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq","openshift-dns/node-resolver-2z95t","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal","openshift-multus/multus-bcj2f","kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal","openshift-cluster-node-tuning-operator/tuned-t5g6z","openshift-image-registry/node-ca-nwk2q","openshift-multus/multus-additional-cni-plugins-p9ckk","openshift-multus/network-metrics-daemon-gpcl9","openshift-network-diagnostics/network-check-target-548cc"] Apr 20 14:53:50.721281 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.721256 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.723474 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.723439 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.724353 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.724329 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 14:53:50.724353 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.724346 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 14:53:50.724561 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.724545 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 14:53:50.724741 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.724727 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 14:53:50.724985 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.724968 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-gwml9\"" Apr 20 14:53:50.728340 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.727693 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vc6rt" Apr 20 14:53:50.728340 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.727790 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.729287 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.729261 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:53:50.729380 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.729301 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:53:50.729568 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.729553 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 14:53:50.729602 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.729579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-f2g4l\"" Apr 20 14:53:50.729776 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.729763 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:53:50.729901 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.729875 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 14:53:50.729901 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.729898 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 14:53:50.730206 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.730190 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:53:50.730858 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.730636 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 14:53:50.730858 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.730647 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h95c4\"" Apr 20 14:53:50.730858 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.730659 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 14:53:50.730858 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.730622 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 14:53:50.730858 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.730701 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5267v\"" Apr 20 14:53:50.730858 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.730740 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 14:53:50.732498 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.732475 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nwk2q" Apr 20 14:53:50.734424 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.734390 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 14:53:50.734651 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.734617 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6rxz6\"" Apr 20 14:53:50.734749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.734730 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 14:53:50.734801 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.734783 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 14:53:50.735552 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.735532 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.740343 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.737556 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 14:53:50.740343 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.739644 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 14:53:50.740343 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.739871 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vk54s\"" Apr 20 14:53:50.740343 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.740128 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.740343 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.740129 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:50.740996 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.740848 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-sys-fs\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.740996 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.740895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcr8r\" (UniqueName: \"kubernetes.io/projected/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-kube-api-access-mcr8r\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.740996 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.740929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-modprobe-d\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.740996 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.740958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysctl-d\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.741227 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-systemd\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.741227 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-sys\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.741227 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8kh6\" (UniqueName: \"kubernetes.io/projected/e747e0ba-1751-42cf-bd02-3a475b905d71-kube-api-access-p8kh6\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.741374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741242 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-cni-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.741374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovn-node-metrics-cert\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.741374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741314 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-conf-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.741374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741338 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-etc-kubernetes\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.741562 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9jpz\" (UniqueName: \"kubernetes.io/projected/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-kube-api-access-b9jpz\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.741562 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741492 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-slash\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.741659 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741574 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-run\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.741659 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-socket-dir-parent\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.741758 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741664 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-cni-bin\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.741758 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysconfig\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.741758 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741747 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e674f45e-6036-47da-a806-c40040927fba-serviceca\") pod \"node-ca-nwk2q\" (UID: \"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q" Apr 20 14:53:50.741890 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:53:50.741814 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-cni-bin\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.741890 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741846 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovnkube-config\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.741890 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/842695c0-6492-4f06-9bbd-385652fc1969-konnectivity-ca\") pod \"konnectivity-agent-vc6rt\" (UID: \"842695c0-6492-4f06-9bbd-385652fc1969\") " pod="kube-system/konnectivity-agent-vc6rt" Apr 20 14:53:50.742054 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.741934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-k8s-cni-cncf-io\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.742054 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-run-netns\") pod \"ovnkube-node-4d2sl\" (UID: 
\"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.742054 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742049 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.742227 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-registration-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.742227 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysctl-conf\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.742227 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742142 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5vbt\" (UniqueName: \"kubernetes.io/projected/e674f45e-6036-47da-a806-c40040927fba-kube-api-access-r5vbt\") pod \"node-ca-nwk2q\" (UID: \"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q" Apr 20 14:53:50.742382 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742244 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:53:50.742382 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-systemd-units\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.742382 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-var-lib-kubelet\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.742382 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-system-cni-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.742382 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742380 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-cnibin\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.742635 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742400 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:53:50.742635 ip-10-0-140-93 kubenswrapper[2575]: I0420 
14:53:50.742407 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-etc-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.742635 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.742635 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-env-overrides\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.742635 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovnkube-script-lib\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.742635 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6q6\" (UniqueName: \"kubernetes.io/projected/a1a5ae57-d482-4ab1-94d0-99811e6761ea-kube-api-access-5c6q6\") pod \"ovnkube-node-4d2sl\" (UID: 
\"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.742635 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-systemd\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742631 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e747e0ba-1751-42cf-bd02-3a475b905d71-tmp\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742679 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-socket-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742712 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-device-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742742 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-kubernetes\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742768 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742773 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-tuned\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-ovn\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742846 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gc8mh\"" Apr 20 14:53:50.742998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742925 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.742834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-hostroot\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743242 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-multus-certs\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743285 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-var-lib-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743322 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/842695c0-6492-4f06-9bbd-385652fc1969-agent-certs\") pod \"konnectivity-agent-vc6rt\" (UID: 
\"842695c0-6492-4f06-9bbd-385652fc1969\") " pod="kube-system/konnectivity-agent-vc6rt" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743354 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-host\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-cni-binary-copy\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743437 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-kubelet\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.743477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-os-release\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e674f45e-6036-47da-a806-c40040927fba-host\") pod \"node-ca-nwk2q\" (UID: 
\"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-lib-modules\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743526 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-cni-multus\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-daemon-config\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743551 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-nzkkj\"" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-kubelet\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.743925 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-etc-selinux\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743691 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-log-socket\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-node-log\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-cni-netd\") pod \"ovnkube-node-4d2sl\" 
(UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.743925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.743843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-netns\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.745289 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.745266 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:53:50.745436 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.745419 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:53:50.745507 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.745493 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7pr4z\"" Apr 20 14:53:50.745706 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.745690 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:53:50.746249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.745947 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:50.746249 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:50.746050 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:53:50.749174 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.749153 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:50.749270 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:50.749237 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:53:50.773113 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.773090 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:48:49 +0000 UTC" deadline="2028-01-13 16:45:25.198407874 +0000 UTC" Apr 20 14:53:50.773113 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.773113 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15193h51m34.425297152s" Apr 20 14:53:50.831536 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.831507 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 14:53:50.844249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844219 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-node-log\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.844374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844262 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-cni-netd\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.844374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844303 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:50.844374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844332 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-netns\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.844374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-node-log\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.844374 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74tj\" (UniqueName: \"kubernetes.io/projected/3004f37c-e216-4320-82a5-11c7c7fe8be1-kube-api-access-n74tj\") pod \"node-resolver-2z95t\" (UID: \"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:50.844617 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:53:50.844386 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-sys-fs\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-netns\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcr8r\" (UniqueName: \"kubernetes.io/projected/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-kube-api-access-mcr8r\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-cni-netd\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-modprobe-d\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 
14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-sys-fs\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysctl-d\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-systemd\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844560 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-systemd\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.844617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844575 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysctl-d\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.844617 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844579 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-modprobe-d\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844645 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-sys\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8kh6\" (UniqueName: \"kubernetes.io/projected/e747e0ba-1751-42cf-bd02-3a475b905d71-kube-api-access-p8kh6\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-cni-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845161 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-sys\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844851 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovn-node-metrics-cert\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvjlh\" (UniqueName: \"kubernetes.io/projected/ae2439f5-03aa-43b8-9466-c01fbcb53912-kube-api-access-mvjlh\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844932 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-cni-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-conf-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845161 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-conf-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.844987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-etc-kubernetes\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9jpz\" (UniqueName: \"kubernetes.io/projected/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-kube-api-access-b9jpz\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845080 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-slash\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845106 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-run\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845077 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-etc-kubernetes\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845150 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-socket-dir-parent\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-slash\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845201 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-run\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-cni-bin\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-cni-bin\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845214 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-socket-dir-parent\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysconfig\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e674f45e-6036-47da-a806-c40040927fba-serviceca\") pod \"node-ca-nwk2q\" (UID: \"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-cni-bin\") pod 
\"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovnkube-config\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysconfig\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845356 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3004f37c-e216-4320-82a5-11c7c7fe8be1-tmp-dir\") pod \"node-resolver-2z95t\" (UID: \"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28830e18-82bf-407f-957f-cbdbc463ca66-host-slash\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845387 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-cni-bin\") 
pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hrcl\" (UniqueName: \"kubernetes.io/projected/28830e18-82bf-407f-957f-cbdbc463ca66-kube-api-access-6hrcl\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g"
Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845457 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-cni-binary-copy\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk"
Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845482 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/842695c0-6492-4f06-9bbd-385652fc1969-konnectivity-ca\") pod \"konnectivity-agent-vc6rt\" (UID: \"842695c0-6492-4f06-9bbd-385652fc1969\") " pod="kube-system/konnectivity-agent-vc6rt"
Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-k8s-cni-cncf-io\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.845734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-run-netns\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845563 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-os-release\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845629 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845652 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-k8s-cni-cncf-io\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-registration-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845762 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysctl-conf\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5vbt\" (UniqueName: \"kubernetes.io/projected/e674f45e-6036-47da-a806-c40040927fba-kube-api-access-r5vbt\") pod \"node-ca-nwk2q\" (UID: \"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-systemd-units\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845840 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3004f37c-e216-4320-82a5-11c7c7fe8be1-hosts-file\") pod \"node-resolver-2z95t\" (UID: \"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845859 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845863 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbgmf\" (UniqueName: \"kubernetes.io/projected/4735b84c-6b45-45bb-8802-627a40d45e62-kube-api-access-wbgmf\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovnkube-config\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-var-lib-kubelet\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-system-cni-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845991 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-registration-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.846575 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.845995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-cnibin\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846044 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-etc-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-sysctl-conf\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-var-lib-kubelet\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-system-cni-dir\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846141 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-run-netns\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-systemd-units\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-cnibin\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-etc-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846268 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-env-overrides\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846308 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovnkube-script-lib\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846329 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6q6\" (UniqueName: \"kubernetes.io/projected/a1a5ae57-d482-4ab1-94d0-99811e6761ea-kube-api-access-5c6q6\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846353 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-systemd\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e674f45e-6036-47da-a806-c40040927fba-serviceca\") pod \"node-ca-nwk2q\" (UID: \"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846379 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e747e0ba-1751-42cf-bd02-3a475b905d71-tmp\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-socket-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.847376 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-device-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-kubernetes\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-systemd\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-tuned\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-kubernetes\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-ovn\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-run-ovn\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846616 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-env-overrides\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-hostroot\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-multus-certs\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-var-lib-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/842695c0-6492-4f06-9bbd-385652fc1969-agent-certs\") pod \"konnectivity-agent-vc6rt\" (UID: \"842695c0-6492-4f06-9bbd-385652fc1969\") " pod="kube-system/konnectivity-agent-vc6rt"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846753 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-host\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-cni-binary-copy\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-kubelet\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-device-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.848194 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-os-release\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846867 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28830e18-82bf-407f-957f-cbdbc463ca66-iptables-alerter-script\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-cnibin\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846920 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e674f45e-6036-47da-a806-c40040927fba-host\") pod \"node-ca-nwk2q\" (UID: \"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/842695c0-6492-4f06-9bbd-385652fc1969-konnectivity-ca\") pod \"konnectivity-agent-vc6rt\" (UID: \"842695c0-6492-4f06-9bbd-385652fc1969\") " pod="kube-system/konnectivity-agent-vc6rt"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846947 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-lib-modules\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846949 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovnkube-script-lib\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-socket-dir\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847011 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-kubelet\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e674f45e-6036-47da-a806-c40040927fba-host\") pod \"node-ca-nwk2q\" (UID: \"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847101 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-os-release\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.846605 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-hostroot\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-lib-modules\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847245 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-run-multus-certs\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e747e0ba-1751-42cf-bd02-3a475b905d71-host\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-var-lib-openvswitch\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847541 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-cni-multus\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.849306 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-daemon-config\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847591 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-kubelet\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-etc-selinux\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-log-socket\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847658 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-system-cni-dir\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-log-socket\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847865 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-kubelet\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a1a5ae57-d482-4ab1-94d0-99811e6761ea-host-run-ovn-kubernetes\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847901 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-host-var-lib-cni-multus\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.847978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-etc-selinux\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.848080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-cni-binary-copy\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.849268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a1a5ae57-d482-4ab1-94d0-99811e6761ea-ovn-node-metrics-cert\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.849386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-multus-daemon-config\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.849731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e747e0ba-1751-42cf-bd02-3a475b905d71-etc-tuned\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.849970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e747e0ba-1751-42cf-bd02-3a475b905d71-tmp\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.850070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.849969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/842695c0-6492-4f06-9bbd-385652fc1969-agent-certs\") pod \"konnectivity-agent-vc6rt\" (UID: \"842695c0-6492-4f06-9bbd-385652fc1969\") " pod="kube-system/konnectivity-agent-vc6rt"
Apr 20 14:53:50.859141 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.859090 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8kh6\" (UniqueName: \"kubernetes.io/projected/e747e0ba-1751-42cf-bd02-3a475b905d71-kube-api-access-p8kh6\") pod \"tuned-t5g6z\" (UID: \"e747e0ba-1751-42cf-bd02-3a475b905d71\") " pod="openshift-cluster-node-tuning-operator/tuned-t5g6z"
Apr 20 14:53:50.859237 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.859170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcr8r\" 
(UniqueName: \"kubernetes.io/projected/bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b-kube-api-access-mcr8r\") pod \"aws-ebs-csi-driver-node-b79nq\" (UID: \"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:50.859474 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.859450 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6q6\" (UniqueName: \"kubernetes.io/projected/a1a5ae57-d482-4ab1-94d0-99811e6761ea-kube-api-access-5c6q6\") pod \"ovnkube-node-4d2sl\" (UID: \"a1a5ae57-d482-4ab1-94d0-99811e6761ea\") " pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:50.860001 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.859980 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5vbt\" (UniqueName: \"kubernetes.io/projected/e674f45e-6036-47da-a806-c40040927fba-kube-api-access-r5vbt\") pod \"node-ca-nwk2q\" (UID: \"e674f45e-6036-47da-a806-c40040927fba\") " pod="openshift-image-registry/node-ca-nwk2q" Apr 20 14:53:50.860105 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.859990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9jpz\" (UniqueName: \"kubernetes.io/projected/37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1-kube-api-access-b9jpz\") pod \"multus-bcj2f\" (UID: \"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1\") " pod="openshift-multus/multus-bcj2f" Apr 20 14:53:50.948973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.948940 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3004f37c-e216-4320-82a5-11c7c7fe8be1-tmp-dir\") pod \"node-resolver-2z95t\" (UID: \"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:50.948973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.948972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/28830e18-82bf-407f-957f-cbdbc463ca66-host-slash\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:50.949213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.948987 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hrcl\" (UniqueName: \"kubernetes.io/projected/28830e18-82bf-407f-957f-cbdbc463ca66-kube-api-access-6hrcl\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:50.949213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949007 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-cni-binary-copy\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949049 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-os-release\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949069 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/28830e18-82bf-407f-957f-cbdbc463ca66-host-slash\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:50.949213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949076 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949158 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-os-release\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3004f37c-e216-4320-82a5-11c7c7fe8be1-hosts-file\") pod \"node-resolver-2z95t\" (UID: \"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:50.949533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbgmf\" (UniqueName: \"kubernetes.io/projected/4735b84c-6b45-45bb-8802-627a40d45e62-kube-api-access-wbgmf\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " 
pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28830e18-82bf-407f-957f-cbdbc463ca66-iptables-alerter-script\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:50.949533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3004f37c-e216-4320-82a5-11c7c7fe8be1-tmp-dir\") pod \"node-resolver-2z95t\" (UID: \"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:50.949533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949438 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-cnibin\") pod 
\"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949505 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-system-cni-dir\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949536 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-cnibin\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949548 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4735b84c-6b45-45bb-8802-627a40d45e62-system-cni-dir\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949604 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-n74tj\" (UniqueName: \"kubernetes.io/projected/3004f37c-e216-4320-82a5-11c7c7fe8be1-kube-api-access-n74tj\") pod \"node-resolver-2z95t\" (UID: \"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-cni-binary-copy\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvjlh\" (UniqueName: \"kubernetes.io/projected/ae2439f5-03aa-43b8-9466-c01fbcb53912-kube-api-access-mvjlh\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3004f37c-e216-4320-82a5-11c7c7fe8be1-hosts-file\") pod \"node-resolver-2z95t\" (UID: \"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949766 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:50.949786 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949834 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/28830e18-82bf-407f-957f-cbdbc463ca66-iptables-alerter-script\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.949856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4735b84c-6b45-45bb-8802-627a40d45e62-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.949898 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:50.949884 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:51.449839944 +0000 UTC m=+3.142383441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:50.956701 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:50.956679 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:50.956701 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:50.956702 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:50.956897 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:50.956713 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4gxs8 for pod openshift-network-diagnostics/network-check-target-548cc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:50.956897 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:50.956778 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8 podName:94a902fd-00c5-4c9a-867d-96a0e32a66c1 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:51.456760769 +0000 UTC m=+3.149304265 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4gxs8" (UniqueName: "kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8") pod "network-check-target-548cc" (UID: "94a902fd-00c5-4c9a-867d-96a0e32a66c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:50.959597 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.959570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvjlh\" (UniqueName: \"kubernetes.io/projected/ae2439f5-03aa-43b8-9466-c01fbcb53912-kube-api-access-mvjlh\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:50.959597 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.959592 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hrcl\" (UniqueName: \"kubernetes.io/projected/28830e18-82bf-407f-957f-cbdbc463ca66-kube-api-access-6hrcl\") pod \"iptables-alerter-pjv9g\" (UID: \"28830e18-82bf-407f-957f-cbdbc463ca66\") " pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:50.959796 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.959706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbgmf\" (UniqueName: \"kubernetes.io/projected/4735b84c-6b45-45bb-8802-627a40d45e62-kube-api-access-wbgmf\") pod \"multus-additional-cni-plugins-p9ckk\" (UID: \"4735b84c-6b45-45bb-8802-627a40d45e62\") " pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:50.959886 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:50.959870 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74tj\" (UniqueName: \"kubernetes.io/projected/3004f37c-e216-4320-82a5-11c7c7fe8be1-kube-api-access-n74tj\") pod \"node-resolver-2z95t\" (UID: 
\"3004f37c-e216-4320-82a5-11c7c7fe8be1\") " pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:51.041544 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.041513 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bcj2f" Apr 20 14:53:51.051286 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.051260 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:53:51.060922 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.060898 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vc6rt" Apr 20 14:53:51.064527 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.064506 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" Apr 20 14:53:51.072058 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.072010 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nwk2q" Apr 20 14:53:51.077567 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.077549 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" Apr 20 14:53:51.083115 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.083093 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" Apr 20 14:53:51.090629 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.090613 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2z95t" Apr 20 14:53:51.097106 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.097088 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-pjv9g" Apr 20 14:53:51.257484 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.257413 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:53:51.452867 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.452834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:51.453044 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:51.452974 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:51.453124 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:51.453059 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:52.453039831 +0000 UTC m=+4.145583308 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:51.544275 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:51.544119 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37f9fe2f_e8a7_4419_9f1d_a937dfadf1c1.slice/crio-1fe6b0145af1d0c0307c91e98b7272fecebc90d7d471ead1a0bd5b10a3dc7ce8 WatchSource:0}: Error finding container 1fe6b0145af1d0c0307c91e98b7272fecebc90d7d471ead1a0bd5b10a3dc7ce8: Status 404 returned error can't find the container with id 1fe6b0145af1d0c0307c91e98b7272fecebc90d7d471ead1a0bd5b10a3dc7ce8 Apr 20 14:53:51.545266 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:51.545241 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod842695c0_6492_4f06_9bbd_385652fc1969.slice/crio-bc3414d7daa1f8e9b579417fc4bae80040e49ac0040ba67007957d57c294c525 WatchSource:0}: Error finding container bc3414d7daa1f8e9b579417fc4bae80040e49ac0040ba67007957d57c294c525: Status 404 returned error can't find the container with id bc3414d7daa1f8e9b579417fc4bae80040e49ac0040ba67007957d57c294c525 Apr 20 14:53:51.547399 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:51.547324 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a5ae57_d482_4ab1_94d0_99811e6761ea.slice/crio-60041b09fa64508f4ee56f2fe6e7e48a67d05fe4834b37cbaa8a18c8ae2c43e3 WatchSource:0}: Error finding container 60041b09fa64508f4ee56f2fe6e7e48a67d05fe4834b37cbaa8a18c8ae2c43e3: Status 404 returned error can't find the container with id 60041b09fa64508f4ee56f2fe6e7e48a67d05fe4834b37cbaa8a18c8ae2c43e3 Apr 20 14:53:51.549723 
ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:51.549691 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode674f45e_6036_47da_a806_c40040927fba.slice/crio-27bc82ea779afd21be1b942e919509ae93b83e7f692b1942cb460e78f6cb1612 WatchSource:0}: Error finding container 27bc82ea779afd21be1b942e919509ae93b83e7f692b1942cb460e78f6cb1612: Status 404 returned error can't find the container with id 27bc82ea779afd21be1b942e919509ae93b83e7f692b1942cb460e78f6cb1612 Apr 20 14:53:51.550645 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:53:51.550625 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode747e0ba_1751_42cf_bd02_3a475b905d71.slice/crio-17d784dc41e03de87ec207928254b3a2ca452c5624bcd67928c1f835eec29217 WatchSource:0}: Error finding container 17d784dc41e03de87ec207928254b3a2ca452c5624bcd67928c1f835eec29217: Status 404 returned error can't find the container with id 17d784dc41e03de87ec207928254b3a2ca452c5624bcd67928c1f835eec29217 Apr 20 14:53:51.553276 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.553165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:51.553276 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:51.553288 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:51.553425 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:51.553304 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:51.553425 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:51.553316 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4gxs8 for pod openshift-network-diagnostics/network-check-target-548cc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:51.553425 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:51.553365 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8 podName:94a902fd-00c5-4c9a-867d-96a0e32a66c1 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:52.553347427 +0000 UTC m=+4.245890924 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gxs8" (UniqueName: "kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8") pod "network-check-target-548cc" (UID: "94a902fd-00c5-4c9a-867d-96a0e32a66c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:51.773853 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.773810 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:48:49 +0000 UTC" deadline="2027-09-26 13:58:43.962850781 +0000 UTC" Apr 20 14:53:51.774214 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.773859 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12575h4m52.189007236s" Apr 20 14:53:51.873092 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.872975 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal" 
event={"ID":"1b1b2e50dbc39040c838976f73463d02","Type":"ContainerStarted","Data":"97fb4c3e9c6e902ec62e4ab0a25e10e313744e1fbbef80404814becb324db5c9"} Apr 20 14:53:51.874065 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.874005 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pjv9g" event={"ID":"28830e18-82bf-407f-957f-cbdbc463ca66","Type":"ContainerStarted","Data":"3ce45ffe9c7bf9d3fb9cfc6e968cac8762e81dd492f1a74c271521593e58a30f"} Apr 20 14:53:51.875033 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.874984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2z95t" event={"ID":"3004f37c-e216-4320-82a5-11c7c7fe8be1","Type":"ContainerStarted","Data":"5d810a5eb54e74d5015661ce5ced69819efd44f0d318c9c79ed3efe977e7b058"} Apr 20 14:53:51.875920 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.875898 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" event={"ID":"4735b84c-6b45-45bb-8802-627a40d45e62","Type":"ContainerStarted","Data":"1a67376df311ee274fa47c2b9caa4fb3568a8a959c870306bbf267038b32c7ce"} Apr 20 14:53:51.876961 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.876939 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" event={"ID":"e747e0ba-1751-42cf-bd02-3a475b905d71","Type":"ContainerStarted","Data":"17d784dc41e03de87ec207928254b3a2ca452c5624bcd67928c1f835eec29217"} Apr 20 14:53:51.877858 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.877837 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nwk2q" event={"ID":"e674f45e-6036-47da-a806-c40040927fba","Type":"ContainerStarted","Data":"27bc82ea779afd21be1b942e919509ae93b83e7f692b1942cb460e78f6cb1612"} Apr 20 14:53:51.878822 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.878805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"60041b09fa64508f4ee56f2fe6e7e48a67d05fe4834b37cbaa8a18c8ae2c43e3"} Apr 20 14:53:51.879764 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.879747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vc6rt" event={"ID":"842695c0-6492-4f06-9bbd-385652fc1969","Type":"ContainerStarted","Data":"bc3414d7daa1f8e9b579417fc4bae80040e49ac0040ba67007957d57c294c525"} Apr 20 14:53:51.880636 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.880618 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" event={"ID":"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b","Type":"ContainerStarted","Data":"7580611e9c5caf8156e3759eb3e6ad327db61617102add3b16e3205e72502010"} Apr 20 14:53:51.881456 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:51.881440 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bcj2f" event={"ID":"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1","Type":"ContainerStarted","Data":"1fe6b0145af1d0c0307c91e98b7272fecebc90d7d471ead1a0bd5b10a3dc7ce8"} Apr 20 14:53:52.461150 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:52.460838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:52.461150 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:52.461007 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:52.461150 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:52.461090 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:54.461070605 +0000 UTC m=+6.153614084 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:52.561957 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:52.561913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:52.562156 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:52.562122 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:52.562156 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:52.562140 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:52.562156 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:52.562153 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4gxs8 for pod openshift-network-diagnostics/network-check-target-548cc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:52.562321 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:52.562210 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8 podName:94a902fd-00c5-4c9a-867d-96a0e32a66c1 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:54.562190563 +0000 UTC m=+6.254734053 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gxs8" (UniqueName: "kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8") pod "network-check-target-548cc" (UID: "94a902fd-00c5-4c9a-867d-96a0e32a66c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:52.867090 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:52.867058 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:52.867531 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:52.867182 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:53:52.867655 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:52.867601 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:52.867724 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:52.867704 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:53:52.897656 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:52.897623 2575 generic.go:358] "Generic (PLEG): container finished" podID="b414fbe4806db5c9685d60534d8eee40" containerID="ed48b3b286d29c347686d700cf39a085e801f2bdcd67e81c867db20b6422d01a" exitCode=0 Apr 20 14:53:52.898559 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:52.898533 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal" event={"ID":"b414fbe4806db5c9685d60534d8eee40","Type":"ContainerDied","Data":"ed48b3b286d29c347686d700cf39a085e801f2bdcd67e81c867db20b6422d01a"} Apr 20 14:53:52.916006 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:52.915569 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-93.ec2.internal" podStartSLOduration=2.915552351 podStartE2EDuration="2.915552351s" podCreationTimestamp="2026-04-20 14:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:51.885247005 +0000 UTC m=+3.577790505" watchObservedRunningTime="2026-04-20 14:53:52.915552351 +0000 UTC m=+4.608095853" Apr 20 14:53:53.927757 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:53.927717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal" event={"ID":"b414fbe4806db5c9685d60534d8eee40","Type":"ContainerStarted","Data":"a9e9071b3cc24b0df1885e35e323c8ca9c1e985158aa2cb716ef3478d38f92ff"} Apr 20 14:53:54.477297 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:54.476973 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:54.477297 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:54.477128 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:54.477297 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:54.477183 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:58.477169546 +0000 UTC m=+10.169713023 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:54.578508 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:54.577790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:54.578508 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:54.577982 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:54.578508 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:54.578000 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:54.578508 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:54.578011 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4gxs8 for pod openshift-network-diagnostics/network-check-target-548cc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:54.578508 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:54.578095 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8 podName:94a902fd-00c5-4c9a-867d-96a0e32a66c1 nodeName:}" failed. No retries permitted until 2026-04-20 14:53:58.578067273 +0000 UTC m=+10.270610763 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gxs8" (UniqueName: "kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8") pod "network-check-target-548cc" (UID: "94a902fd-00c5-4c9a-867d-96a0e32a66c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:54.864033 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:54.863980 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:54.864221 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:54.864068 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:54.864221 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:54.864172 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:53:54.864507 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:54.864453 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:53:56.867265 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:56.866614 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:56.867265 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:56.866746 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:53:56.867265 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:56.867134 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:56.867265 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:56.867216 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:53:58.511484 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:58.510932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:58.511484 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:58.511091 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:58.511484 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:58.511158 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:06.511138215 +0000 UTC m=+18.203681715 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:53:58.612314 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:58.612281 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:58.612465 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:58.612391 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:53:58.612465 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:58.612409 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:53:58.612465 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:58.612419 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4gxs8 for pod openshift-network-diagnostics/network-check-target-548cc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:58.612465 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:58.612467 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8 podName:94a902fd-00c5-4c9a-867d-96a0e32a66c1 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:54:06.612454469 +0000 UTC m=+18.304997946 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gxs8" (UniqueName: "kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8") pod "network-check-target-548cc" (UID: "94a902fd-00c5-4c9a-867d-96a0e32a66c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:53:58.865862 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:58.865208 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:53:58.865862 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:53:58.865241 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:53:58.865862 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:58.865334 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:53:58.865862 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:53:58.865809 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:54:00.866274 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:00.864561 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:00.866274 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:00.864676 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:54:00.866274 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:00.864571 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:54:00.866274 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:00.865049 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:54:02.864496 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:02.864461 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:02.865055 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:02.864572 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:54:02.865055 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:02.864670 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:54:02.865055 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:02.864792 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:54:04.864149 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:04.864112 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:04.864602 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:04.864122 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:54:04.864602 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:04.864224 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:54:04.864602 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:04.864314 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:54:06.570671 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:06.570608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:54:06.571141 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:06.570735 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:54:06.571141 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:06.570814 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:54:22.570791969 +0000 UTC m=+34.263335450 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:54:06.671609 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:06.671570 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:06.671783 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:06.671758 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:54:06.671783 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:06.671782 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:54:06.671869 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:06.671796 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4gxs8 for pod openshift-network-diagnostics/network-check-target-548cc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:54:06.671869 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:06.671858 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8 
podName:94a902fd-00c5-4c9a-867d-96a0e32a66c1 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:22.671839809 +0000 UTC m=+34.364383302 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gxs8" (UniqueName: "kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8") pod "network-check-target-548cc" (UID: "94a902fd-00c5-4c9a-867d-96a0e32a66c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:54:06.864273 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:06.864188 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:06.864444 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:06.864199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:54:06.864444 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:06.864319 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:54:06.864444 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:06.864425 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:54:08.865178 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:08.865145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:08.865596 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:08.865234 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1" Apr 20 14:54:08.865596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:08.865330 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:54:08.865596 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:08.865456 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:54:09.961902 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.961680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2z95t" event={"ID":"3004f37c-e216-4320-82a5-11c7c7fe8be1","Type":"ContainerStarted","Data":"c0143eea5916cb181e3290d0f4c0663525ba82889506e5931bd784e7ff8181d5"} Apr 20 14:54:09.963083 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.963060 2575 generic.go:358] "Generic (PLEG): container finished" podID="4735b84c-6b45-45bb-8802-627a40d45e62" containerID="104ad32fe47710310ff2f6f288e878240d3e299b359be25b3b60408f291a16ec" exitCode=0 Apr 20 14:54:09.963188 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.963122 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" event={"ID":"4735b84c-6b45-45bb-8802-627a40d45e62","Type":"ContainerDied","Data":"104ad32fe47710310ff2f6f288e878240d3e299b359be25b3b60408f291a16ec"} Apr 20 14:54:09.964258 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.964232 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" event={"ID":"e747e0ba-1751-42cf-bd02-3a475b905d71","Type":"ContainerStarted","Data":"f73eb1e984c388c54ab99e0f8fd29e183096697b4d8b195b118af6d09d6af292"} Apr 20 14:54:09.965524 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.965503 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nwk2q" event={"ID":"e674f45e-6036-47da-a806-c40040927fba","Type":"ContainerStarted","Data":"8fe199ca717d06b9331d1ba363cfdb018d8654671873e6c1e22d2aecc244f9b9"} Apr 20 14:54:09.967717 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.967693 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log" Apr 20 14:54:09.968003 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.967985 2575 generic.go:358] "Generic (PLEG): container finished" podID="a1a5ae57-d482-4ab1-94d0-99811e6761ea" containerID="58365d6f353b273d7a072b73da607cf5cd0e6af5c1de5d16f2b4b85e76157f5a" exitCode=1
Apr 20 14:54:09.968079 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.968066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"057b3a0efb5b1882774ee016d9b3025287df5017917936ab5f79b03c7be4d1be"}
Apr 20 14:54:09.968123 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.968084 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"e7d5433eaf507ece1fcc6d366f243f65341575e0a2847ff1eb292c234ed0bd17"}
Apr 20 14:54:09.968123 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.968092 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"0a4a3b67686ff5441ea69cd598d349162b0550a88a22a352e9450c99d5049ff1"}
Apr 20 14:54:09.968123 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.968100 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerDied","Data":"58365d6f353b273d7a072b73da607cf5cd0e6af5c1de5d16f2b4b85e76157f5a"}
Apr 20 14:54:09.968123 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.968110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"fadb145d880f9c8fb54ca6c650eee7b81b2b645d2203b4a09c31894df7c5c4aa"}
Apr 20 14:54:09.969261 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.969242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vc6rt" event={"ID":"842695c0-6492-4f06-9bbd-385652fc1969","Type":"ContainerStarted","Data":"07e8f61641e7e6163a80b6c443dacfc8d0c9bb587a15fa607ed9910064dbdf64"}
Apr 20 14:54:09.970401 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.970382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" event={"ID":"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b","Type":"ContainerStarted","Data":"e086e18ce02bbf1b8a50e2b090159913fc42f539e464cf9acacf5148d5cba8bd"}
Apr 20 14:54:09.971765 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.971746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bcj2f" event={"ID":"37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1","Type":"ContainerStarted","Data":"3b63a859c6738c22f2e57f6a020869eb5e645951a0506ff9b6bbcfdf5c219ff9"}
Apr 20 14:54:09.980373 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.980334 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2z95t" podStartSLOduration=3.46096712 podStartE2EDuration="20.98032388s" podCreationTimestamp="2026-04-20 14:53:49 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.576528718 +0000 UTC m=+3.269072199" lastFinishedPulling="2026-04-20 14:54:09.095885477 +0000 UTC m=+20.788428959" observedRunningTime="2026-04-20 14:54:09.979828068 +0000 UTC m=+21.672371566" watchObservedRunningTime="2026-04-20 14:54:09.98032388 +0000 UTC m=+21.672867379"
Apr 20 14:54:09.980450 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.980402 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-93.ec2.internal" podStartSLOduration=19.980398417 podStartE2EDuration="19.980398417s" podCreationTimestamp="2026-04-20 14:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:53.958339471 +0000 UTC m=+5.650882972" watchObservedRunningTime="2026-04-20 14:54:09.980398417 +0000 UTC m=+21.672941917"
Apr 20 14:54:09.993090 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:09.993057 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nwk2q" podStartSLOduration=4.398858834 podStartE2EDuration="21.993046883s" podCreationTimestamp="2026-04-20 14:53:48 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.552767836 +0000 UTC m=+3.245311325" lastFinishedPulling="2026-04-20 14:54:09.146955882 +0000 UTC m=+20.839499374" observedRunningTime="2026-04-20 14:54:09.99259929 +0000 UTC m=+21.685142788" watchObservedRunningTime="2026-04-20 14:54:09.993046883 +0000 UTC m=+21.685590382"
Apr 20 14:54:10.028008 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.027970 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bcj2f" podStartSLOduration=4.411955228 podStartE2EDuration="22.027961272s" podCreationTimestamp="2026-04-20 14:53:48 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.546409187 +0000 UTC m=+3.238952679" lastFinishedPulling="2026-04-20 14:54:09.162415242 +0000 UTC m=+20.854958723" observedRunningTime="2026-04-20 14:54:10.027459764 +0000 UTC m=+21.720003263" watchObservedRunningTime="2026-04-20 14:54:10.027961272 +0000 UTC m=+21.720504771"
Apr 20 14:54:10.040939 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.040898 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vc6rt" podStartSLOduration=4.440802549 podStartE2EDuration="22.040888613s" podCreationTimestamp="2026-04-20 14:53:48 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.546871004 +0000 UTC m=+3.239414490" lastFinishedPulling="2026-04-20 14:54:09.146957063 +0000 UTC m=+20.839500554" observedRunningTime="2026-04-20 14:54:10.040647477 +0000 UTC m=+21.733190978" watchObservedRunningTime="2026-04-20 14:54:10.040888613 +0000 UTC m=+21.733432111"
Apr 20 14:54:10.056477 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.056441 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t5g6z" podStartSLOduration=4.459730819 podStartE2EDuration="22.056430911s" podCreationTimestamp="2026-04-20 14:53:48 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.553593376 +0000 UTC m=+3.246136863" lastFinishedPulling="2026-04-20 14:54:09.150293464 +0000 UTC m=+20.842836955" observedRunningTime="2026-04-20 14:54:10.056363225 +0000 UTC m=+21.748906725" watchObservedRunningTime="2026-04-20 14:54:10.056430911 +0000 UTC m=+21.748974430"
Apr 20 14:54:10.864562 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.864330 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:54:10.864562 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:10.864456 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912"
Apr 20 14:54:10.864562 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.864465 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:54:10.864562 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:10.864535 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1"
Apr 20 14:54:10.866455 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.866403 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 14:54:10.976883 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.976804 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 14:54:10.977311 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.977238 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"723b580e9aed561b3cc59eb12944ac0f130ab1be2a06ac6b095478e38dcff34f"}
Apr 20 14:54:10.978972 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.978936 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" event={"ID":"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b","Type":"ContainerStarted","Data":"131d9fa4531f3b63289d1014a423a3e8ab2192911e79c7b6d42eed53aadbc593"}
Apr 20 14:54:10.980585 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.980560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pjv9g" event={"ID":"28830e18-82bf-407f-957f-cbdbc463ca66","Type":"ContainerStarted","Data":"0235fd2eccad85e39136061b57b612f190296fd451d345dc7fb72283796e10eb"}
Apr 20 14:54:10.995007 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:10.994960 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pjv9g" podStartSLOduration=4.475515743 podStartE2EDuration="21.994945311s" podCreationTimestamp="2026-04-20 14:53:49 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.576461697 +0000 UTC m=+3.269005186" lastFinishedPulling="2026-04-20 14:54:09.095891277 +0000 UTC m=+20.788434754" observedRunningTime="2026-04-20 14:54:10.994253777 +0000 UTC m=+22.686797276" watchObservedRunningTime="2026-04-20 14:54:10.994945311 +0000 UTC m=+22.687488813"
Apr 20 14:54:11.802212 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:11.802101 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:54:10.866417899Z","UUID":"c51c703c-777f-464d-88d5-d10ed0c71291","Handler":null,"Name":"","Endpoint":""}
Apr 20 14:54:11.803940 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:11.803909 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 14:54:11.803940 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:11.803939 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 14:54:12.863935 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:12.863730 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:54:12.864336 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:12.863790 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:54:12.864336 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:12.864084 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1"
Apr 20 14:54:12.864336 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:12.864127 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912"
Apr 20 14:54:12.986835 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:12.986803 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 14:54:12.987274 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:12.987247 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"208ef25b41e68f29a656005476181d009ec361f8946713993d30630498b22699"}
Apr 20 14:54:12.989154 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:12.989125 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" event={"ID":"bd12a9c8-f89e-4b3f-9b54-c6cb0de15e8b","Type":"ContainerStarted","Data":"b33fa5732171160bbdb543a690910dc516bfb03ea8184776cbd586e58b60a03a"}
Apr 20 14:54:13.006947 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:13.006892 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-b79nq" podStartSLOduration=4.570651025 podStartE2EDuration="25.006876592s" podCreationTimestamp="2026-04-20 14:53:48 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.576409997 +0000 UTC m=+3.268953478" lastFinishedPulling="2026-04-20 14:54:12.012635564 +0000 UTC m=+23.705179045" observedRunningTime="2026-04-20 14:54:13.006384839 +0000 UTC m=+24.698928340" watchObservedRunningTime="2026-04-20 14:54:13.006876592 +0000 UTC m=+24.699420092"
Apr 20 14:54:13.421099 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:13.421061 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vc6rt"
Apr 20 14:54:13.421806 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:13.421780 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vc6rt"
Apr 20 14:54:13.991255 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:13.991215 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vc6rt"
Apr 20 14:54:13.991766 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:13.991554 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vc6rt"
Apr 20 14:54:14.864369 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:14.864198 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:54:14.864515 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:14.864220 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:54:14.864515 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:14.864439 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1"
Apr 20 14:54:14.864597 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:14.864573 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912"
Apr 20 14:54:14.993995 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:14.993963 2575 generic.go:358] "Generic (PLEG): container finished" podID="4735b84c-6b45-45bb-8802-627a40d45e62" containerID="c61aed16eddce1be4b1d8672942173488ba21236fc638da6528314af6fe7ad64" exitCode=0
Apr 20 14:54:14.994692 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:14.994054 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" event={"ID":"4735b84c-6b45-45bb-8802-627a40d45e62","Type":"ContainerDied","Data":"c61aed16eddce1be4b1d8672942173488ba21236fc638da6528314af6fe7ad64"}
Apr 20 14:54:14.996994 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:14.996971 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 14:54:14.997326 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:14.997305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"42520329bdb4562d166f5608974b1086cbc145f940c23b6ccc1663ceec579ae8"}
Apr 20 14:54:14.997789 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:14.997773 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:54:14.997925 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:14.997913 2575 scope.go:117] "RemoveContainer" containerID="58365d6f353b273d7a072b73da607cf5cd0e6af5c1de5d16f2b4b85e76157f5a"
Apr 20 14:54:15.013863 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:15.013841 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:54:16.001730 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.001641 2575 generic.go:358] "Generic (PLEG): container finished" podID="4735b84c-6b45-45bb-8802-627a40d45e62" containerID="5ab42d5601d1ec5a9be7f452c7dea0fe473c7832d7a60ba0d414a6179f36e59f" exitCode=0
Apr 20 14:54:16.002148 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.001721 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" event={"ID":"4735b84c-6b45-45bb-8802-627a40d45e62","Type":"ContainerDied","Data":"5ab42d5601d1ec5a9be7f452c7dea0fe473c7832d7a60ba0d414a6179f36e59f"}
Apr 20 14:54:16.005487 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.005463 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 14:54:16.005844 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.005814 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" event={"ID":"a1a5ae57-d482-4ab1-94d0-99811e6761ea","Type":"ContainerStarted","Data":"28ecd2276531d5e1ba2f9a501e97b58f4b2dbec1ea82e365586868456a0f1727"}
Apr 20 14:54:16.006198 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.006181 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:54:16.006269 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.006207 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:54:16.021037 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.021001 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl"
Apr 20 14:54:16.051102 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.051063 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" podStartSLOduration=10.401400928 podStartE2EDuration="28.05104911s" podCreationTimestamp="2026-04-20 14:53:48 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.549140113 +0000 UTC m=+3.241683603" lastFinishedPulling="2026-04-20 14:54:09.198788294 +0000 UTC m=+20.891331785" observedRunningTime="2026-04-20 14:54:16.050602311 +0000 UTC m=+27.743146120" watchObservedRunningTime="2026-04-20 14:54:16.05104911 +0000 UTC m=+27.743592609"
Apr 20 14:54:16.353191 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.353149 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-548cc"]
Apr 20 14:54:16.353348 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.353285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:54:16.353407 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:16.353387 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1"
Apr 20 14:54:16.353723 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.353668 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gpcl9"]
Apr 20 14:54:16.353852 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:16.353796 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:54:16.353900 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:16.353875 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912"
Apr 20 14:54:17.009278 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:17.009194 2575 generic.go:358] "Generic (PLEG): container finished" podID="4735b84c-6b45-45bb-8802-627a40d45e62" containerID="b5ec71f2eaa438edb3e255d508eb2214813c48eb310c4981188bfad0c6937b47" exitCode=0
Apr 20 14:54:17.009616 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:17.009284 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" event={"ID":"4735b84c-6b45-45bb-8802-627a40d45e62","Type":"ContainerDied","Data":"b5ec71f2eaa438edb3e255d508eb2214813c48eb310c4981188bfad0c6937b47"}
Apr 20 14:54:17.864859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:17.864822 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:54:17.864859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:17.864876 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:54:17.865149 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:17.864997 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912"
Apr 20 14:54:17.865149 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:17.865099 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1"
Apr 20 14:54:19.864774 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:19.864740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:54:19.865154 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:19.864740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:54:19.865154 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:19.864888 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912"
Apr 20 14:54:19.865154 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:19.864915 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1"
Apr 20 14:54:21.864796 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:21.864752 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:54:21.865509 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:21.864893 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912"
Apr 20 14:54:21.865509 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:21.864980 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:54:21.865509 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:21.865100 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-548cc" podUID="94a902fd-00c5-4c9a-867d-96a0e32a66c1"
Apr 20 14:54:22.156080 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.155973 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-93.ec2.internal" event="NodeReady"
Apr 20 14:54:22.156246 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.156125 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 14:54:22.201265 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.201232 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-w22hl"]
Apr 20 14:54:22.208254 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.208216 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8swcv"]
Apr 20 14:54:22.208395 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.208369 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w22hl"
Apr 20 14:54:22.210800 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.210763 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 14:54:22.210931 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.210846 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 14:54:22.210931 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.210871 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 14:54:22.211077 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.210951 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jp5zp\""
Apr 20 14:54:22.211436 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.211404 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.214035 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.213843 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w22hl"]
Apr 20 14:54:22.214035 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.213883 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rc7ph\""
Apr 20 14:54:22.214035 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.213886 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 14:54:22.214035 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.213936 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 14:54:22.228891 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.228868 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8swcv"]
Apr 20 14:54:22.283324 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.283295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl"
Apr 20 14:54:22.283324 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.283334 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.283543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.283359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p89q\" (UniqueName: \"kubernetes.io/projected/0e64e671-ff76-45fc-b205-a75b74329230-kube-api-access-8p89q\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl"
Apr 20 14:54:22.283543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.283441 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e19d0653-6009-45aa-a269-a68af8375182-tmp-dir\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.283543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.283466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrfbz\" (UniqueName: \"kubernetes.io/projected/e19d0653-6009-45aa-a269-a68af8375182-kube-api-access-nrfbz\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.283543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.283491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e19d0653-6009-45aa-a269-a68af8375182-config-volume\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.384731 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.384688 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e19d0653-6009-45aa-a269-a68af8375182-tmp-dir\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.384907 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.384750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrfbz\" (UniqueName: \"kubernetes.io/projected/e19d0653-6009-45aa-a269-a68af8375182-kube-api-access-nrfbz\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.384907 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.384799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e19d0653-6009-45aa-a269-a68af8375182-config-volume\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.384907 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.384829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl"
Apr 20 14:54:22.384907 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.384862 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.384907 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.384892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8p89q\" (UniqueName: \"kubernetes.io/projected/0e64e671-ff76-45fc-b205-a75b74329230-kube-api-access-8p89q\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl"
Apr 20 14:54:22.385336 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.384939 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:54:22.385336 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.385012 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert podName:0e64e671-ff76-45fc-b205-a75b74329230 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:22.88499197 +0000 UTC m=+34.577535447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert") pod "ingress-canary-w22hl" (UID: "0e64e671-ff76-45fc-b205-a75b74329230") : secret "canary-serving-cert" not found
Apr 20 14:54:22.385336 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.385097 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e19d0653-6009-45aa-a269-a68af8375182-tmp-dir\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.385336 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.385136 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:54:22.385336 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.385186 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls podName:e19d0653-6009-45aa-a269-a68af8375182 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:22.885169233 +0000 UTC m=+34.577712714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls") pod "dns-default-8swcv" (UID: "e19d0653-6009-45aa-a269-a68af8375182") : secret "dns-default-metrics-tls" not found
Apr 20 14:54:22.385511 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.385338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e19d0653-6009-45aa-a269-a68af8375182-config-volume\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.398445 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.398421 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrfbz\" (UniqueName: \"kubernetes.io/projected/e19d0653-6009-45aa-a269-a68af8375182-kube-api-access-nrfbz\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:54:22.412709 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.412653 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p89q\" (UniqueName: \"kubernetes.io/projected/0e64e671-ff76-45fc-b205-a75b74329230-kube-api-access-8p89q\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl"
Apr 20 14:54:22.586333 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.586286 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:54:22.586519 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.586433 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:54:22.586519 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.586502 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:54.586484245 +0000 UTC m=+66.279027726 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:54:22.687485 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.687406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:54:22.687620 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.687560 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:54:22.687620 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.687584 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:54:22.687620 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.687594 2575 projected.go:194] Error preparing data for projected volume kube-api-access-4gxs8 for pod openshift-network-diagnostics/network-check-target-548cc: [object
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:54:22.687713 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.687643 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8 podName:94a902fd-00c5-4c9a-867d-96a0e32a66c1 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:54.687629764 +0000 UTC m=+66.380173245 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-4gxs8" (UniqueName: "kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8") pod "network-check-target-548cc" (UID: "94a902fd-00c5-4c9a-867d-96a0e32a66c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:54:22.888825 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.888790 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:54:22.888825 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:22.888833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv" Apr 20 14:54:22.889477 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.888936 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:22.889477 ip-10-0-140-93 kubenswrapper[2575]: E0420 
14:54:22.888943 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:22.889477 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.888997 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls podName:e19d0653-6009-45aa-a269-a68af8375182 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:23.888979636 +0000 UTC m=+35.581523112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls") pod "dns-default-8swcv" (UID: "e19d0653-6009-45aa-a269-a68af8375182") : secret "dns-default-metrics-tls" not found Apr 20 14:54:22.889477 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:22.889012 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert podName:0e64e671-ff76-45fc-b205-a75b74329230 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:23.889004645 +0000 UTC m=+35.581548122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert") pod "ingress-canary-w22hl" (UID: "0e64e671-ff76-45fc-b205-a75b74329230") : secret "canary-serving-cert" not found Apr 20 14:54:23.864541 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.864513 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:54:23.864692 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.864513 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:23.868712 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.868680 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:54:23.868712 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.868683 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:54:23.868866 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.868725 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:54:23.868866 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.868682 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ns2f2\"" Apr 20 14:54:23.868866 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.868690 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-np6cl\"" Apr 20 14:54:23.896753 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.896736 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:54:23.897061 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:23.896765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv" Apr 20 14:54:23.897061 ip-10-0-140-93 
kubenswrapper[2575]: E0420 14:54:23.896867 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:23.897061 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:23.896867 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:23.897061 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:23.896924 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert podName:0e64e671-ff76-45fc-b205-a75b74329230 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:25.896907524 +0000 UTC m=+37.589451000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert") pod "ingress-canary-w22hl" (UID: "0e64e671-ff76-45fc-b205-a75b74329230") : secret "canary-serving-cert" not found Apr 20 14:54:23.897061 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:23.896938 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls podName:e19d0653-6009-45aa-a269-a68af8375182 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:25.896931956 +0000 UTC m=+37.589475433 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls") pod "dns-default-8swcv" (UID: "e19d0653-6009-45aa-a269-a68af8375182") : secret "dns-default-metrics-tls" not found Apr 20 14:54:24.026122 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:24.026083 2575 generic.go:358] "Generic (PLEG): container finished" podID="4735b84c-6b45-45bb-8802-627a40d45e62" containerID="4f1a8b816f7f9ecfeae93bcbdc6292bc2473737040950cd964f7f7c18514662f" exitCode=0 Apr 20 14:54:24.026299 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:24.026140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" event={"ID":"4735b84c-6b45-45bb-8802-627a40d45e62","Type":"ContainerDied","Data":"4f1a8b816f7f9ecfeae93bcbdc6292bc2473737040950cd964f7f7c18514662f"} Apr 20 14:54:25.030603 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:25.030576 2575 generic.go:358] "Generic (PLEG): container finished" podID="4735b84c-6b45-45bb-8802-627a40d45e62" containerID="ec8a2252c85486436752795efebff2d387a70e0c6af6e0fb25f7ad74bd1aa726" exitCode=0 Apr 20 14:54:25.030951 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:25.030630 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" event={"ID":"4735b84c-6b45-45bb-8802-627a40d45e62","Type":"ContainerDied","Data":"ec8a2252c85486436752795efebff2d387a70e0c6af6e0fb25f7ad74bd1aa726"} Apr 20 14:54:25.910836 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:25.910653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:54:25.910987 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:25.910845 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv" Apr 20 14:54:25.910987 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:25.910789 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:25.910987 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:25.910947 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:25.910987 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:25.910968 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert podName:0e64e671-ff76-45fc-b205-a75b74329230 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:29.910947552 +0000 UTC m=+41.603491042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert") pod "ingress-canary-w22hl" (UID: "0e64e671-ff76-45fc-b205-a75b74329230") : secret "canary-serving-cert" not found Apr 20 14:54:25.910987 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:25.910986 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls podName:e19d0653-6009-45aa-a269-a68af8375182 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:29.910975626 +0000 UTC m=+41.603519102 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls") pod "dns-default-8swcv" (UID: "e19d0653-6009-45aa-a269-a68af8375182") : secret "dns-default-metrics-tls" not found Apr 20 14:54:26.035680 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:26.035638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" event={"ID":"4735b84c-6b45-45bb-8802-627a40d45e62","Type":"ContainerStarted","Data":"c5ab2bab9f741cba49a584b79155aa6cbe849ecb8dfe11daeab844a7192d2f2e"} Apr 20 14:54:26.056493 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:26.056450 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p9ckk" podStartSLOduration=6.490518568 podStartE2EDuration="38.056434536s" podCreationTimestamp="2026-04-20 14:53:48 +0000 UTC" firstStartedPulling="2026-04-20 14:53:51.576577631 +0000 UTC m=+3.269121115" lastFinishedPulling="2026-04-20 14:54:23.142493591 +0000 UTC m=+34.835037083" observedRunningTime="2026-04-20 14:54:26.055254913 +0000 UTC m=+37.747798409" watchObservedRunningTime="2026-04-20 14:54:26.056434536 +0000 UTC m=+37.748978036" Apr 20 14:54:29.936560 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:29.936500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:54:29.936560 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:29.936556 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " 
pod="openshift-dns/dns-default-8swcv" Apr 20 14:54:29.936976 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:29.936654 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:29.936976 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:29.936668 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:29.936976 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:29.936717 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert podName:0e64e671-ff76-45fc-b205-a75b74329230 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:37.936702402 +0000 UTC m=+49.629245879 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert") pod "ingress-canary-w22hl" (UID: "0e64e671-ff76-45fc-b205-a75b74329230") : secret "canary-serving-cert" not found Apr 20 14:54:29.936976 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:29.936730 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls podName:e19d0653-6009-45aa-a269-a68af8375182 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:37.936724823 +0000 UTC m=+49.629268300 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls") pod "dns-default-8swcv" (UID: "e19d0653-6009-45aa-a269-a68af8375182") : secret "dns-default-metrics-tls" not found Apr 20 14:54:37.993808 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:37.993765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:54:37.993808 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:37.993808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv" Apr 20 14:54:37.994331 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:37.993906 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:37.994331 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:37.993910 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:37.994331 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:37.993955 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls podName:e19d0653-6009-45aa-a269-a68af8375182 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:53.993943084 +0000 UTC m=+65.686486561 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls") pod "dns-default-8swcv" (UID: "e19d0653-6009-45aa-a269-a68af8375182") : secret "dns-default-metrics-tls" not found Apr 20 14:54:37.994331 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:37.993977 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert podName:0e64e671-ff76-45fc-b205-a75b74329230 nodeName:}" failed. No retries permitted until 2026-04-20 14:54:53.99396278 +0000 UTC m=+65.686506257 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert") pod "ingress-canary-w22hl" (UID: "0e64e671-ff76-45fc-b205-a75b74329230") : secret "canary-serving-cert" not found Apr 20 14:54:48.022268 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:48.022235 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4d2sl" Apr 20 14:54:54.010264 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.010226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:54:54.010264 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.010269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv" Apr 20 14:54:54.010764 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:54.010371 2575 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:54:54.010764 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:54.010375 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:54:54.010764 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:54.010435 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert podName:0e64e671-ff76-45fc-b205-a75b74329230 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:26.010420548 +0000 UTC m=+97.702964025 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert") pod "ingress-canary-w22hl" (UID: "0e64e671-ff76-45fc-b205-a75b74329230") : secret "canary-serving-cert" not found Apr 20 14:54:54.010764 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:54.010448 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls podName:e19d0653-6009-45aa-a269-a68af8375182 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:26.010442564 +0000 UTC m=+97.702986041 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls") pod "dns-default-8swcv" (UID: "e19d0653-6009-45aa-a269-a68af8375182") : secret "dns-default-metrics-tls" not found Apr 20 14:54:54.614272 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.614240 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:54:54.616703 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.616687 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:54:54.624645 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:54.624625 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 14:54:54.624690 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:54:54.624684 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.624669038 +0000 UTC m=+130.317212515 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : secret "metrics-daemon-secret" not found Apr 20 14:54:54.714945 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.714911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:54.718091 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.718074 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:54:54.728410 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.728396 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:54:54.739756 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.739736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gxs8\" (UniqueName: \"kubernetes.io/projected/94a902fd-00c5-4c9a-867d-96a0e32a66c1-kube-api-access-4gxs8\") pod \"network-check-target-548cc\" (UID: \"94a902fd-00c5-4c9a-867d-96a0e32a66c1\") " pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:54.780975 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.780950 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-np6cl\"" Apr 20 14:54:54.788962 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.788945 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:54.927658 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:54.927633 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-548cc"] Apr 20 14:54:54.930896 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:54:54.930868 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a902fd_00c5_4c9a_867d_96a0e32a66c1.slice/crio-c2ac140176bd2abf00066856205e7e5da8a51df22d4b1c34096086db86d4ce7a WatchSource:0}: Error finding container c2ac140176bd2abf00066856205e7e5da8a51df22d4b1c34096086db86d4ce7a: Status 404 returned error can't find the container with id c2ac140176bd2abf00066856205e7e5da8a51df22d4b1c34096086db86d4ce7a Apr 20 14:54:55.089337 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:55.089301 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-548cc" event={"ID":"94a902fd-00c5-4c9a-867d-96a0e32a66c1","Type":"ContainerStarted","Data":"c2ac140176bd2abf00066856205e7e5da8a51df22d4b1c34096086db86d4ce7a"} Apr 20 14:54:58.096700 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:58.096665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-548cc" event={"ID":"94a902fd-00c5-4c9a-867d-96a0e32a66c1","Type":"ContainerStarted","Data":"b2f798dadac80436dcac5b1a566fde511d8cef6a6ac779c4f74ce72e9801b79b"} Apr 20 14:54:58.097166 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:58.096812 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-548cc" Apr 20 14:54:58.111689 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:54:58.111619 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-548cc" 
podStartSLOduration=66.457948147 podStartE2EDuration="1m9.111606804s" podCreationTimestamp="2026-04-20 14:53:49 +0000 UTC" firstStartedPulling="2026-04-20 14:54:54.932692233 +0000 UTC m=+66.625235714" lastFinishedPulling="2026-04-20 14:54:57.58635087 +0000 UTC m=+69.278894371" observedRunningTime="2026-04-20 14:54:58.111296704 +0000 UTC m=+69.803840215" watchObservedRunningTime="2026-04-20 14:54:58.111606804 +0000 UTC m=+69.804150333"
Apr 20 14:55:26.038577 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:26.038538 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl"
Apr 20 14:55:26.038577 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:26.038581 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv"
Apr 20 14:55:26.039010 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:26.038674 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:55:26.039010 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:26.038682 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:55:26.039010 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:26.038735 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls podName:e19d0653-6009-45aa-a269-a68af8375182 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:30.038718306 +0000 UTC m=+161.731261784 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls") pod "dns-default-8swcv" (UID: "e19d0653-6009-45aa-a269-a68af8375182") : secret "dns-default-metrics-tls" not found
Apr 20 14:55:26.039010 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:26.038749 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert podName:0e64e671-ff76-45fc-b205-a75b74329230 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:30.038743384 +0000 UTC m=+161.731286861 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert") pod "ingress-canary-w22hl" (UID: "0e64e671-ff76-45fc-b205-a75b74329230") : secret "canary-serving-cert" not found
Apr 20 14:55:29.101301 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:29.101270 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-548cc"
Apr 20 14:55:49.049339 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.049308 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7rn4l"]
Apr 20 14:55:49.056192 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.056165 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"]
Apr 20 14:55:49.056351 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.056332 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.058675 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.058651 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 14:55:49.058675 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.058664 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 14:55:49.059449 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.059431 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.059682 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.059564 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 20 14:55:49.059682 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.059587 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-f9dqw\""
Apr 20 14:55:49.059943 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.059928 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 20 14:55:49.062134 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.062113 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 20 14:55:49.062591 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.062418 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 14:55:49.062591 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.062506 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-qktjh\""
Apr 20 14:55:49.062591 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.062513 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 14:55:49.064037 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.063491 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 20 14:55:49.065122 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.065101 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 20 14:55:49.065242 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.065224 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7rn4l"]
Apr 20 14:55:49.067576 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.067557 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"]
Apr 20 14:55:49.145350 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.145321 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7b6cfdf44-qtvqj"]
Apr 20 14:55:49.148535 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.148520 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.151327 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.151288 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 20 14:55:49.151327 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.151300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 14:55:49.151327 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.151297 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 20 14:55:49.151564 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.151336 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 20 14:55:49.151564 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.151345 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 20 14:55:49.151564 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.151300 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 14:55:49.151564 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.151306 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-s5g4c\""
Apr 20 14:55:49.157284 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.157260 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b6cfdf44-qtvqj"]
Apr 20 14:55:49.195415 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f2a619-5bb8-4702-ae7e-217e448429bc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.195516 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195421 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87t5r\" (UniqueName: \"kubernetes.io/projected/34f2a619-5bb8-4702-ae7e-217e448429bc-kube-api-access-87t5r\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.195516 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr8gc\" (UniqueName: \"kubernetes.io/projected/abfd5746-0891-4e15-9237-6631a29b8009-kube-api-access-fr8gc\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.195516 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.195516 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/34f2a619-5bb8-4702-ae7e-217e448429bc-snapshots\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.195644 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f2a619-5bb8-4702-ae7e-217e448429bc-service-ca-bundle\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.195644 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195571 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34f2a619-5bb8-4702-ae7e-217e448429bc-tmp\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.195644 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195586 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f2a619-5bb8-4702-ae7e-217e448429bc-serving-cert\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.195644 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.195620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/abfd5746-0891-4e15-9237-6631a29b8009-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.296072 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/abfd5746-0891-4e15-9237-6631a29b8009-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.296165 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296132 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f2a619-5bb8-4702-ae7e-217e448429bc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.296200 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fhfr\" (UniqueName: \"kubernetes.io/projected/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-kube-api-access-5fhfr\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.296242 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296198 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87t5r\" (UniqueName: \"kubernetes.io/projected/34f2a619-5bb8-4702-ae7e-217e448429bc-kube-api-access-87t5r\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.296242 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296224 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr8gc\" (UniqueName: \"kubernetes.io/projected/abfd5746-0891-4e15-9237-6631a29b8009-kube-api-access-fr8gc\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.296338 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.296338 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.296338 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296314 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/34f2a619-5bb8-4702-ae7e-217e448429bc-snapshots\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.296470 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.296358 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:49.296470 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f2a619-5bb8-4702-ae7e-217e448429bc-service-ca-bundle\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.296470 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.296430 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls podName:abfd5746-0891-4e15-9237-6631a29b8009 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:49.796408287 +0000 UTC m=+121.488951766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bp8pr" (UID: "abfd5746-0891-4e15-9237-6631a29b8009") : secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:49.296470 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296462 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.296663 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296511 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-default-certificate\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.296663 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34f2a619-5bb8-4702-ae7e-217e448429bc-tmp\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.296663 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f2a619-5bb8-4702-ae7e-217e448429bc-serving-cert\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.296663 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296625 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-stats-auth\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.296866 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296848 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/abfd5746-0891-4e15-9237-6631a29b8009-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.296909 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34f2a619-5bb8-4702-ae7e-217e448429bc-tmp\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.296909 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f2a619-5bb8-4702-ae7e-217e448429bc-service-ca-bundle\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.296984 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.296933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/34f2a619-5bb8-4702-ae7e-217e448429bc-snapshots\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.297140 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.297123 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34f2a619-5bb8-4702-ae7e-217e448429bc-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.299075 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.299050 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/34f2a619-5bb8-4702-ae7e-217e448429bc-serving-cert\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.309339 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.309287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87t5r\" (UniqueName: \"kubernetes.io/projected/34f2a619-5bb8-4702-ae7e-217e448429bc-kube-api-access-87t5r\") pod \"insights-operator-585dfdc468-7rn4l\" (UID: \"34f2a619-5bb8-4702-ae7e-217e448429bc\") " pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.309508 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.309490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr8gc\" (UniqueName: \"kubernetes.io/projected/abfd5746-0891-4e15-9237-6631a29b8009-kube-api-access-fr8gc\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.368486 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.368455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-7rn4l"
Apr 20 14:55:49.397922 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.397898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5fhfr\" (UniqueName: \"kubernetes.io/projected/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-kube-api-access-5fhfr\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.398037 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.397942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.398037 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.397971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.398037 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.398010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-default-certificate\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.398160 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.398075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-stats-auth\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.398160 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.398118 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 14:55:49.398160 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.398131 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:55:49.89810999 +0000 UTC m=+121.590653468 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : configmap references non-existent config key: service-ca.crt
Apr 20 14:55:49.398305 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.398174 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:55:49.898161197 +0000 UTC m=+121.590704683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : secret "router-metrics-certs-default" not found
Apr 20 14:55:49.400418 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.400395 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-default-certificate\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.400602 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.400577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-stats-auth\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.406555 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.406532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fhfr\" (UniqueName: \"kubernetes.io/projected/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-kube-api-access-5fhfr\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.501545 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.501516 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7rn4l"]
Apr 20 14:55:49.504378 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:55:49.504349 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34f2a619_5bb8_4702_ae7e_217e448429bc.slice/crio-e8a0a59dc2de3c19f9508b8f5a345e6c35f49863c973a4cce67bba57b103f7cd WatchSource:0}: Error finding container e8a0a59dc2de3c19f9508b8f5a345e6c35f49863c973a4cce67bba57b103f7cd: Status 404 returned error can't find the container with id e8a0a59dc2de3c19f9508b8f5a345e6c35f49863c973a4cce67bba57b103f7cd
Apr 20 14:55:49.801112 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.801082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:49.801263 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.801207 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:49.801311 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.801268 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls podName:abfd5746-0891-4e15-9237-6631a29b8009 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:50.801252336 +0000 UTC m=+122.493795815 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bp8pr" (UID: "abfd5746-0891-4e15-9237-6631a29b8009") : secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:49.901822 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.901783 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.901822 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:49.901828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:49.902015 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.901961 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:55:50.901943503 +0000 UTC m=+122.594486998 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : configmap references non-existent config key: service-ca.crt
Apr 20 14:55:49.902015 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.901968 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 14:55:49.902128 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:49.902013 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:55:50.902000437 +0000 UTC m=+122.594543915 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : secret "router-metrics-certs-default" not found
Apr 20 14:55:50.195998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:50.195910 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7rn4l" event={"ID":"34f2a619-5bb8-4702-ae7e-217e448429bc","Type":"ContainerStarted","Data":"e8a0a59dc2de3c19f9508b8f5a345e6c35f49863c973a4cce67bba57b103f7cd"}
Apr 20 14:55:50.808097 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:50.808054 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:50.808287 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:50.808228 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:50.808361 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:50.808310 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls podName:abfd5746-0891-4e15-9237-6631a29b8009 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:52.80828829 +0000 UTC m=+124.500831770 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bp8pr" (UID: "abfd5746-0891-4e15-9237-6631a29b8009") : secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:50.908797 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:50.908755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:50.908797 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:50.908807 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj"
Apr 20 14:55:50.909076 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:50.908937 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:55:52.908914007 +0000 UTC m=+124.601457492 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : configmap references non-existent config key: service-ca.crt
Apr 20 14:55:50.909076 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:50.908984 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 14:55:50.909191 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:50.909079 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:55:52.909061465 +0000 UTC m=+124.601604951 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : secret "router-metrics-certs-default" not found
Apr 20 14:55:52.201218 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:52.201173 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7rn4l" event={"ID":"34f2a619-5bb8-4702-ae7e-217e448429bc","Type":"ContainerStarted","Data":"9f4484cde0bf271de562b1b10f7231b49bc53bda087aea8c4aaea7962d0dbacf"}
Apr 20 14:55:52.216456 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:52.216340 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-7rn4l" podStartSLOduration=0.670787818 podStartE2EDuration="3.216326063s" podCreationTimestamp="2026-04-20 14:55:49 +0000 UTC" firstStartedPulling="2026-04-20 14:55:49.506240142 +0000 UTC m=+121.198783618" lastFinishedPulling="2026-04-20 14:55:52.051778363 +0000 UTC m=+123.744321863" observedRunningTime="2026-04-20 14:55:52.215662295 +0000 UTC m=+123.908205794" watchObservedRunningTime="2026-04-20 14:55:52.216326063 +0000 UTC m=+123.908869566"
Apr 20 14:55:52.820971 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:52.820936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"
Apr 20 14:55:52.821157 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:52.821106 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20
14:55:52.821200 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:52.821167 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls podName:abfd5746-0891-4e15-9237-6631a29b8009 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:56.821151489 +0000 UTC m=+128.513694969 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bp8pr" (UID: "abfd5746-0891-4e15-9237-6631a29b8009") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:55:52.921823 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:52.921791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:55:52.921823 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:52.921824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:55:52.922048 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:52.921926 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:55:52.922048 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:52.921960 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:55:56.921943052 +0000 UTC m=+128.614486529 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : configmap references non-existent config key: service-ca.crt Apr 20 14:55:52.922048 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:52.921981 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:55:56.92197397 +0000 UTC m=+128.614517446 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : secret "router-metrics-certs-default" not found Apr 20 14:55:53.974496 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:53.974463 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd"] Apr 20 14:55:53.977395 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:53.977378 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd" Apr 20 14:55:53.979859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:53.979831 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:53.980801 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:53.980783 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:53.980895 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:53.980789 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-5f7c8\"" Apr 20 14:55:53.986340 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:53.986318 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd"] Apr 20 14:55:54.130426 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:54.130389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wktb\" (UniqueName: \"kubernetes.io/projected/10750e47-7592-4544-9e16-62ee13fcf036-kube-api-access-9wktb\") pod \"volume-data-source-validator-7c6cbb6c87-82wgd\" (UID: \"10750e47-7592-4544-9e16-62ee13fcf036\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd" Apr 20 14:55:54.231195 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:54.231111 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wktb\" (UniqueName: \"kubernetes.io/projected/10750e47-7592-4544-9e16-62ee13fcf036-kube-api-access-9wktb\") pod \"volume-data-source-validator-7c6cbb6c87-82wgd\" (UID: \"10750e47-7592-4544-9e16-62ee13fcf036\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd" Apr 20 14:55:54.239341 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:54.239313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wktb\" (UniqueName: \"kubernetes.io/projected/10750e47-7592-4544-9e16-62ee13fcf036-kube-api-access-9wktb\") pod \"volume-data-source-validator-7c6cbb6c87-82wgd\" (UID: \"10750e47-7592-4544-9e16-62ee13fcf036\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd" Apr 20 14:55:54.286205 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:54.286175 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd" Apr 20 14:55:54.397409 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:54.397380 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd"] Apr 20 14:55:54.400451 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:55:54.400425 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10750e47_7592_4544_9e16_62ee13fcf036.slice/crio-7243e58b73d6abcf573418f996b0d31db3dde718958ec3124db97cb6f70f3898 WatchSource:0}: Error finding container 7243e58b73d6abcf573418f996b0d31db3dde718958ec3124db97cb6f70f3898: Status 404 returned error can't find the container with id 7243e58b73d6abcf573418f996b0d31db3dde718958ec3124db97cb6f70f3898 Apr 20 14:55:54.894779 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:54.894743 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2z95t_3004f37c-e216-4320-82a5-11c7c7fe8be1/dns-node-resolver/0.log" Apr 20 14:55:55.208417 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:55.208337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd" event={"ID":"10750e47-7592-4544-9e16-62ee13fcf036","Type":"ContainerStarted","Data":"7243e58b73d6abcf573418f996b0d31db3dde718958ec3124db97cb6f70f3898"} Apr 20 14:55:55.893801 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:55.893773 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nwk2q_e674f45e-6036-47da-a806-c40040927fba/node-ca/0.log" Apr 20 14:55:56.210997 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:56.210915 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd" event={"ID":"10750e47-7592-4544-9e16-62ee13fcf036","Type":"ContainerStarted","Data":"66d0269f94a7ec5f12c6d69629ee75ab8e455e44306643d361fb92603b7df153"} Apr 20 14:55:56.235533 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:56.235491 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-82wgd" podStartSLOduration=1.776555063 podStartE2EDuration="3.235478256s" podCreationTimestamp="2026-04-20 14:55:53 +0000 UTC" firstStartedPulling="2026-04-20 14:55:54.402356329 +0000 UTC m=+126.094899810" lastFinishedPulling="2026-04-20 14:55:55.861279513 +0000 UTC m=+127.553823003" observedRunningTime="2026-04-20 14:55:56.235068379 +0000 UTC m=+127.927611879" watchObservedRunningTime="2026-04-20 14:55:56.235478256 +0000 UTC m=+127.928021755" Apr 20 14:55:56.853062 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:56.852996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr" Apr 20 14:55:56.853254 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:56.853142 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:55:56.853254 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:56.853206 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls podName:abfd5746-0891-4e15-9237-6631a29b8009 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.853192 +0000 UTC m=+136.545735478 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bp8pr" (UID: "abfd5746-0891-4e15-9237-6631a29b8009") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:55:56.954338 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:56.954306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:55:56.954338 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:56.954342 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:55:56.954568 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:56.954432 2575 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:55:56.954568 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:56.954451 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.954435696 +0000 UTC m=+136.646979174 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : configmap references non-existent config key: service-ca.crt Apr 20 14:55:56.954675 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:56.954587 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.95456544 +0000 UTC m=+136.647108918 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : secret "router-metrics-certs-default" not found Apr 20 14:55:57.084590 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.084561 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b6cbd5b9-fw9hv"] Apr 20 14:55:57.087157 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.087137 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.099956 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.099934 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b6cbd5b9-fw9hv"] Apr 20 14:55:57.103183 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.103141 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 14:55:57.103183 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.103160 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 14:55:57.109996 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.109974 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rrj9j\"" Apr 20 14:55:57.109996 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.109996 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 14:55:57.143558 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.143526 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 14:55:57.156272 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.156246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-bound-sa-token\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.156388 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.156283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-image-registry-private-configuration\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.156388 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.156312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.156388 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.156362 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ced62a51-e841-46d2-b901-6bf468781d83-ca-trust-extracted\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.156388 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.156386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlr95\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-kube-api-access-mlr95\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.156525 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.156459 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-registry-certificates\") pod \"image-registry-b6cbd5b9-fw9hv\" 
(UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.156525 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.156475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-trusted-ca\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.156525 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.156516 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-installation-pull-secrets\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257527 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.257499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-registry-certificates\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257527 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.257532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-trusted-ca\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.257578 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-installation-pull-secrets\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.257643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-bound-sa-token\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.257672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-image-registry-private-configuration\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.257695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.257718 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ced62a51-e841-46d2-b901-6bf468781d83-ca-trust-extracted\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: 
\"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.257743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlr95\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-kube-api-access-mlr95\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:57.257830 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:57.257848 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b6cbd5b9-fw9hv: secret "image-registry-tls" not found Apr 20 14:55:57.257917 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:57.257903 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls podName:ced62a51-e841-46d2-b901-6bf468781d83 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.757884183 +0000 UTC m=+129.450427660 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls") pod "image-registry-b6cbd5b9-fw9hv" (UID: "ced62a51-e841-46d2-b901-6bf468781d83") : secret "image-registry-tls" not found Apr 20 14:55:57.258231 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.258172 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-registry-certificates\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.258231 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.258169 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ced62a51-e841-46d2-b901-6bf468781d83-ca-trust-extracted\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.258660 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.258642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-trusted-ca\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.260645 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.260625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-installation-pull-secrets\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.260731 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.260687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-image-registry-private-configuration\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.281401 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.281375 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-bound-sa-token\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.281601 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.281584 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlr95\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-kube-api-access-mlr95\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.762484 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:57.762449 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:57.762626 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:57.762551 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:57.762626 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:57.762563 2575 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b6cbd5b9-fw9hv: secret "image-registry-tls" not found Apr 20 14:55:57.762626 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:57.762621 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls podName:ced62a51-e841-46d2-b901-6bf468781d83 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.762606275 +0000 UTC m=+130.455149751 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls") pod "image-registry-b6cbd5b9-fw9hv" (UID: "ced62a51-e841-46d2-b901-6bf468781d83") : secret "image-registry-tls" not found Apr 20 14:55:58.668716 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:58.668681 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:55:58.669120 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:58.668827 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 14:55:58.669120 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:58.668893 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs podName:ae2439f5-03aa-43b8-9466-c01fbcb53912 nodeName:}" failed. No retries permitted until 2026-04-20 14:58:00.668878006 +0000 UTC m=+252.361421483 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs") pod "network-metrics-daemon-gpcl9" (UID: "ae2439f5-03aa-43b8-9466-c01fbcb53912") : secret "metrics-daemon-secret" not found Apr 20 14:55:58.769803 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:55:58.769769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:55:58.769942 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:58.769904 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:58.769942 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:58.769920 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b6cbd5b9-fw9hv: secret "image-registry-tls" not found Apr 20 14:55:58.770011 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:55:58.769971 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls podName:ced62a51-e841-46d2-b901-6bf468781d83 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:00.769956735 +0000 UTC m=+132.462500211 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls") pod "image-registry-b6cbd5b9-fw9hv" (UID: "ced62a51-e841-46d2-b901-6bf468781d83") : secret "image-registry-tls" not found Apr 20 14:56:00.784454 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:00.784403 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:56:00.784879 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:00.784554 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:56:00.784879 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:00.784577 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b6cbd5b9-fw9hv: secret "image-registry-tls" not found Apr 20 14:56:00.784879 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:00.784633 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls podName:ced62a51-e841-46d2-b901-6bf468781d83 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.784617367 +0000 UTC m=+136.477160844 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls") pod "image-registry-b6cbd5b9-fw9hv" (UID: "ced62a51-e841-46d2-b901-6bf468781d83") : secret "image-registry-tls" not found Apr 20 14:56:00.960569 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:00.960540 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm"] Apr 20 14:56:00.963145 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:00.963132 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm" Apr 20 14:56:00.971410 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:00.971386 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm"] Apr 20 14:56:00.974752 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:00.974737 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mrtp8\"" Apr 20 14:56:01.087559 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:01.087499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8785\" (UniqueName: \"kubernetes.io/projected/6e15808a-d43e-44e1-9df4-bda8a66093bc-kube-api-access-f8785\") pod \"network-check-source-8894fc9bd-fmqbm\" (UID: \"6e15808a-d43e-44e1-9df4-bda8a66093bc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm" Apr 20 14:56:01.188460 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:01.188435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8785\" (UniqueName: \"kubernetes.io/projected/6e15808a-d43e-44e1-9df4-bda8a66093bc-kube-api-access-f8785\") pod \"network-check-source-8894fc9bd-fmqbm\" (UID: 
\"6e15808a-d43e-44e1-9df4-bda8a66093bc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm" Apr 20 14:56:01.211544 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:01.211520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8785\" (UniqueName: \"kubernetes.io/projected/6e15808a-d43e-44e1-9df4-bda8a66093bc-kube-api-access-f8785\") pod \"network-check-source-8894fc9bd-fmqbm\" (UID: \"6e15808a-d43e-44e1-9df4-bda8a66093bc\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm" Apr 20 14:56:01.272798 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:01.272777 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm" Apr 20 14:56:01.385943 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:01.385906 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e15808a_d43e_44e1_9df4_bda8a66093bc.slice/crio-75b1a53470afe1c546e06b5c06260ce4efd8966eca37cd29249484a034c58501 WatchSource:0}: Error finding container 75b1a53470afe1c546e06b5c06260ce4efd8966eca37cd29249484a034c58501: Status 404 returned error can't find the container with id 75b1a53470afe1c546e06b5c06260ce4efd8966eca37cd29249484a034c58501 Apr 20 14:56:01.392390 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:01.392372 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm"] Apr 20 14:56:02.223074 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.223042 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm" event={"ID":"6e15808a-d43e-44e1-9df4-bda8a66093bc","Type":"ContainerStarted","Data":"b95e0c46da5a0dbb737c5fb0da295d4377a1cbdbe2a443d5e547890df59cbcef"} Apr 20 14:56:02.223526 ip-10-0-140-93 kubenswrapper[2575]: I0420 
14:56:02.223085 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm" event={"ID":"6e15808a-d43e-44e1-9df4-bda8a66093bc","Type":"ContainerStarted","Data":"75b1a53470afe1c546e06b5c06260ce4efd8966eca37cd29249484a034c58501"} Apr 20 14:56:02.262103 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.262059 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-fmqbm" podStartSLOduration=2.262044983 podStartE2EDuration="2.262044983s" podCreationTimestamp="2026-04-20 14:56:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:02.261866494 +0000 UTC m=+133.954409993" watchObservedRunningTime="2026-04-20 14:56:02.262044983 +0000 UTC m=+133.954588479" Apr 20 14:56:02.365539 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.365508 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv"] Apr 20 14:56:02.368328 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.368313 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" Apr 20 14:56:02.377978 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.377953 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv"] Apr 20 14:56:02.388218 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.388195 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 14:56:02.388305 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.388195 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-bbd2c\"" Apr 20 14:56:02.388305 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.388231 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 14:56:02.499187 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.499113 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nddzw\" (UniqueName: \"kubernetes.io/projected/83a12d4d-86d3-4c54-a107-3a8275bd04db-kube-api-access-nddzw\") pod \"migrator-74bb7799d9-f49kv\" (UID: \"83a12d4d-86d3-4c54-a107-3a8275bd04db\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" Apr 20 14:56:02.600005 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.599974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nddzw\" (UniqueName: \"kubernetes.io/projected/83a12d4d-86d3-4c54-a107-3a8275bd04db-kube-api-access-nddzw\") pod \"migrator-74bb7799d9-f49kv\" (UID: \"83a12d4d-86d3-4c54-a107-3a8275bd04db\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" Apr 20 14:56:02.622579 ip-10-0-140-93 kubenswrapper[2575]: 
I0420 14:56:02.622547 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nddzw\" (UniqueName: \"kubernetes.io/projected/83a12d4d-86d3-4c54-a107-3a8275bd04db-kube-api-access-nddzw\") pod \"migrator-74bb7799d9-f49kv\" (UID: \"83a12d4d-86d3-4c54-a107-3a8275bd04db\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" Apr 20 14:56:02.677762 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.677727 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" Apr 20 14:56:02.792119 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:02.792070 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a12d4d_86d3_4c54_a107_3a8275bd04db.slice/crio-e9ae3cd563953c63d2b827cd49dd30374d89764a91036502d7ba0791fd208557 WatchSource:0}: Error finding container e9ae3cd563953c63d2b827cd49dd30374d89764a91036502d7ba0791fd208557: Status 404 returned error can't find the container with id e9ae3cd563953c63d2b827cd49dd30374d89764a91036502d7ba0791fd208557 Apr 20 14:56:02.792505 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:02.792486 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv"] Apr 20 14:56:03.225802 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:03.225722 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" event={"ID":"83a12d4d-86d3-4c54-a107-3a8275bd04db","Type":"ContainerStarted","Data":"e9ae3cd563953c63d2b827cd49dd30374d89764a91036502d7ba0791fd208557"} Apr 20 14:56:04.817979 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:04.817939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:56:04.818404 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:04.818113 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:56:04.818404 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:04.818131 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b6cbd5b9-fw9hv: secret "image-registry-tls" not found Apr 20 14:56:04.818404 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:04.818186 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls podName:ced62a51-e841-46d2-b901-6bf468781d83 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.8181704 +0000 UTC m=+144.510713876 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls") pod "image-registry-b6cbd5b9-fw9hv" (UID: "ced62a51-e841-46d2-b901-6bf468781d83") : secret "image-registry-tls" not found Apr 20 14:56:04.918699 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:04.918667 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr" Apr 20 14:56:04.918877 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:04.918837 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:56:04.918941 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:04.918917 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls podName:abfd5746-0891-4e15-9237-6631a29b8009 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:20.918896903 +0000 UTC m=+152.611440399 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-bp8pr" (UID: "abfd5746-0891-4e15-9237-6631a29b8009") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:56:05.020067 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:05.020036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:05.020263 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:05.020110 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:05.020263 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:05.020142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:56:21.020123158 +0000 UTC m=+152.712666638 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : configmap references non-existent config key: service-ca.crt Apr 20 14:56:05.020263 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:05.020202 2575 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:56:05.020263 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:05.020255 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs podName:ec2d9386-fc11-40ea-b4da-f2e1aa8d435e nodeName:}" failed. No retries permitted until 2026-04-20 14:56:21.020240699 +0000 UTC m=+152.712784190 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs") pod "router-default-7b6cfdf44-qtvqj" (UID: "ec2d9386-fc11-40ea-b4da-f2e1aa8d435e") : secret "router-metrics-certs-default" not found Apr 20 14:56:05.233436 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:05.233349 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" event={"ID":"83a12d4d-86d3-4c54-a107-3a8275bd04db","Type":"ContainerStarted","Data":"27d117d68d0c949b3a2c1305c03b3a19388792ac764c8dba04f7735134c86108"} Apr 20 14:56:05.233436 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:05.233389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" event={"ID":"83a12d4d-86d3-4c54-a107-3a8275bd04db","Type":"ContainerStarted","Data":"349807cd0e570ead8c39e9d1be04664fccfa8f103d7735a2022ff3662ca5c759"} Apr 20 14:56:05.258383 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:05.258338 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f49kv" podStartSLOduration=1.7935080719999998 podStartE2EDuration="3.258319755s" podCreationTimestamp="2026-04-20 14:56:02 +0000 UTC" firstStartedPulling="2026-04-20 14:56:02.793854244 +0000 UTC m=+134.486397721" lastFinishedPulling="2026-04-20 14:56:04.258665924 +0000 UTC m=+135.951209404" observedRunningTime="2026-04-20 14:56:05.257187901 +0000 UTC m=+136.949731401" watchObservedRunningTime="2026-04-20 14:56:05.258319755 +0000 UTC m=+136.950863259" Apr 20 14:56:12.887096 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:12.887063 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:56:12.889552 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:12.889527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"image-registry-b6cbd5b9-fw9hv\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") " pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:56:12.995820 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:12.995787 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:56:13.118767 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:13.118735 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b6cbd5b9-fw9hv"] Apr 20 14:56:13.121482 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:13.121453 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podced62a51_e841_46d2_b901_6bf468781d83.slice/crio-9ea390edf7e3ac89e8ffa21ddac341ffdd030bcd4f2dcf47fa3dc11d1a35e115 WatchSource:0}: Error finding container 9ea390edf7e3ac89e8ffa21ddac341ffdd030bcd4f2dcf47fa3dc11d1a35e115: Status 404 returned error can't find the container with id 9ea390edf7e3ac89e8ffa21ddac341ffdd030bcd4f2dcf47fa3dc11d1a35e115 Apr 20 14:56:13.253878 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:13.253842 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" event={"ID":"ced62a51-e841-46d2-b901-6bf468781d83","Type":"ContainerStarted","Data":"88a7a0c1664e24c0ebad907f50055fb2da587c85cae9f8510e8c6296dcda76e2"} Apr 20 14:56:13.253878 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:13.253879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" event={"ID":"ced62a51-e841-46d2-b901-6bf468781d83","Type":"ContainerStarted","Data":"9ea390edf7e3ac89e8ffa21ddac341ffdd030bcd4f2dcf47fa3dc11d1a35e115"} Apr 20 14:56:13.254074 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:13.253962 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:56:13.271318 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:13.271271 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" 
podStartSLOduration=16.271256691 podStartE2EDuration="16.271256691s" podCreationTimestamp="2026-04-20 14:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:13.270740119 +0000 UTC m=+144.963283618" watchObservedRunningTime="2026-04-20 14:56:13.271256691 +0000 UTC m=+144.963800190" Apr 20 14:56:20.948753 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:20.948694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr" Apr 20 14:56:20.951209 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:20.951188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/abfd5746-0891-4e15-9237-6631a29b8009-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-bp8pr\" (UID: \"abfd5746-0891-4e15-9237-6631a29b8009\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr" Apr 20 14:56:21.049502 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:21.049467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:21.049701 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:21.049559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:21.050121 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:21.050103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-service-ca-bundle\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:21.052012 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:21.051994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec2d9386-fc11-40ea-b4da-f2e1aa8d435e-metrics-certs\") pod \"router-default-7b6cfdf44-qtvqj\" (UID: \"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e\") " pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:21.173357 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:21.173321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr" Apr 20 14:56:21.257490 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:21.257459 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:21.295210 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:21.295007 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr"] Apr 20 14:56:21.297728 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:21.297701 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabfd5746_0891_4e15_9237_6631a29b8009.slice/crio-d59ba74547c9ffc0899d3d4d511aed8edee771cdddb604c6b5253828d8bca7ef WatchSource:0}: Error finding container d59ba74547c9ffc0899d3d4d511aed8edee771cdddb604c6b5253828d8bca7ef: Status 404 returned error can't find the container with id d59ba74547c9ffc0899d3d4d511aed8edee771cdddb604c6b5253828d8bca7ef Apr 20 14:56:21.376548 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:21.376517 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7b6cfdf44-qtvqj"] Apr 20 14:56:21.379476 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:21.379448 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2d9386_fc11_40ea_b4da_f2e1aa8d435e.slice/crio-1b382d872182036995831ca9fb3508bd42ba114f9c8c4516e4885d6af7c5afde WatchSource:0}: Error finding container 1b382d872182036995831ca9fb3508bd42ba114f9c8c4516e4885d6af7c5afde: Status 404 returned error can't find the container with id 1b382d872182036995831ca9fb3508bd42ba114f9c8c4516e4885d6af7c5afde Apr 20 14:56:22.275843 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:22.275807 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" event={"ID":"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e","Type":"ContainerStarted","Data":"f1d128eb5b9596f499e628e51599166b2cf8fe265ff4f36a0f2f56d3255f43c8"} Apr 20 14:56:22.275843 ip-10-0-140-93 kubenswrapper[2575]: 
I0420 14:56:22.275843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" event={"ID":"ec2d9386-fc11-40ea-b4da-f2e1aa8d435e","Type":"ContainerStarted","Data":"1b382d872182036995831ca9fb3508bd42ba114f9c8c4516e4885d6af7c5afde"} Apr 20 14:56:22.276913 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:22.276884 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr" event={"ID":"abfd5746-0891-4e15-9237-6631a29b8009","Type":"ContainerStarted","Data":"d59ba74547c9ffc0899d3d4d511aed8edee771cdddb604c6b5253828d8bca7ef"} Apr 20 14:56:22.295962 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:22.295910 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" podStartSLOduration=33.295898094 podStartE2EDuration="33.295898094s" podCreationTimestamp="2026-04-20 14:55:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:22.294609607 +0000 UTC m=+153.987153120" watchObservedRunningTime="2026-04-20 14:56:22.295898094 +0000 UTC m=+153.988441592" Apr 20 14:56:23.257978 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:23.257903 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:23.260647 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:23.260626 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:23.280696 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:23.280665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr" 
event={"ID":"abfd5746-0891-4e15-9237-6631a29b8009","Type":"ContainerStarted","Data":"6f9086e8cff0f3d9de2690cff9548827189a30f4d3370edb4b706e41c6640e38"} Apr 20 14:56:23.281105 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:23.280919 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:23.281966 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:23.281945 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7b6cfdf44-qtvqj" Apr 20 14:56:23.318377 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:23.318329 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-bp8pr" podStartSLOduration=32.647976728 podStartE2EDuration="34.318310611s" podCreationTimestamp="2026-04-20 14:55:49 +0000 UTC" firstStartedPulling="2026-04-20 14:56:21.299433872 +0000 UTC m=+152.991977352" lastFinishedPulling="2026-04-20 14:56:22.969767754 +0000 UTC m=+154.662311235" observedRunningTime="2026-04-20 14:56:23.317136058 +0000 UTC m=+155.009679557" watchObservedRunningTime="2026-04-20 14:56:23.318310611 +0000 UTC m=+155.010854111" Apr 20 14:56:25.163702 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.163669 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-s2sjx"] Apr 20 14:56:25.167200 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.167178 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.169910 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.169883 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 14:56:25.170956 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.170935 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 14:56:25.171099 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.170936 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-z4lp2\"" Apr 20 14:56:25.181584 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.181564 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s2sjx"] Apr 20 14:56:25.183212 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.183187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/433b9f99-bb71-468d-9a18-83234a426f09-crio-socket\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.183332 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.183260 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/433b9f99-bb71-468d-9a18-83234a426f09-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.183332 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.183295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/433b9f99-bb71-468d-9a18-83234a426f09-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.183437 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.183340 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/433b9f99-bb71-468d-9a18-83234a426f09-data-volume\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.183488 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.183444 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9q46\" (UniqueName: \"kubernetes.io/projected/433b9f99-bb71-468d-9a18-83234a426f09-kube-api-access-x9q46\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.202767 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.202746 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b6cbd5b9-fw9hv"] Apr 20 14:56:25.214563 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.214538 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bdb7cb467-cqn7c"] Apr 20 14:56:25.219322 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.219305 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.224071 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:25.224041 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-w22hl" podUID="0e64e671-ff76-45fc-b205-a75b74329230" Apr 20 14:56:25.229189 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.229167 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bdb7cb467-cqn7c"] Apr 20 14:56:25.229673 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:25.229649 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8swcv" podUID="e19d0653-6009-45aa-a269-a68af8375182" Apr 20 14:56:25.284126 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/433b9f99-bb71-468d-9a18-83234a426f09-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.284260 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284137 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/433b9f99-bb71-468d-9a18-83234a426f09-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.284260 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284167 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fc25d12-330f-41bf-82d8-7158f40715bf-installation-pull-secrets\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.284260 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284209 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/433b9f99-bb71-468d-9a18-83234a426f09-data-volume\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.284383 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-bound-sa-token\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.284383 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284353 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fc25d12-330f-41bf-82d8-7158f40715bf-trusted-ca\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.284471 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fc25d12-330f-41bf-82d8-7158f40715bf-ca-trust-extracted\") pod \"image-registry-7bdb7cb467-cqn7c\" 
(UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.284525 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284495 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdt2s\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-kube-api-access-gdt2s\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.284525 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284517 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/433b9f99-bb71-468d-9a18-83234a426f09-data-volume\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.284625 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9q46\" (UniqueName: \"kubernetes.io/projected/433b9f99-bb71-468d-9a18-83234a426f09-kube-api-access-x9q46\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.284625 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284594 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/433b9f99-bb71-468d-9a18-83234a426f09-crio-socket\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.284710 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284657 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2fc25d12-330f-41bf-82d8-7158f40715bf-image-registry-private-configuration\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.284710 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fc25d12-330f-41bf-82d8-7158f40715bf-registry-certificates\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.284806 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-registry-tls\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.284806 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/433b9f99-bb71-468d-9a18-83234a426f09-crio-socket\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.284806 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.284742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/433b9f99-bb71-468d-9a18-83234a426f09-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.285945 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.285930 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8swcv" Apr 20 14:56:25.285999 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.285961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:56:25.286864 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.286846 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/433b9f99-bb71-468d-9a18-83234a426f09-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.300573 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.300556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9q46\" (UniqueName: \"kubernetes.io/projected/433b9f99-bb71-468d-9a18-83234a426f09-kube-api-access-x9q46\") pod \"insights-runtime-extractor-s2sjx\" (UID: \"433b9f99-bb71-468d-9a18-83234a426f09\") " pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.385632 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.385598 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2fc25d12-330f-41bf-82d8-7158f40715bf-image-registry-private-configuration\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.385771 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.385647 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fc25d12-330f-41bf-82d8-7158f40715bf-registry-certificates\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.385771 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.385666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-registry-tls\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.385771 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.385695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fc25d12-330f-41bf-82d8-7158f40715bf-installation-pull-secrets\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.385918 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.385857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-bound-sa-token\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.385918 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.385892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fc25d12-330f-41bf-82d8-7158f40715bf-trusted-ca\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: 
\"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.386055 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.385927 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fc25d12-330f-41bf-82d8-7158f40715bf-ca-trust-extracted\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.386204 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.386181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdt2s\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-kube-api-access-gdt2s\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.386430 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.386408 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2fc25d12-330f-41bf-82d8-7158f40715bf-ca-trust-extracted\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.386924 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.386900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2fc25d12-330f-41bf-82d8-7158f40715bf-registry-certificates\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.387216 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.387193 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2fc25d12-330f-41bf-82d8-7158f40715bf-trusted-ca\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.388550 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.388522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2fc25d12-330f-41bf-82d8-7158f40715bf-image-registry-private-configuration\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.388637 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.388569 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-registry-tls\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.388810 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.388793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2fc25d12-330f-41bf-82d8-7158f40715bf-installation-pull-secrets\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.394892 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.394871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-bound-sa-token\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " 
pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.395081 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.395063 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdt2s\" (UniqueName: \"kubernetes.io/projected/2fc25d12-330f-41bf-82d8-7158f40715bf-kube-api-access-gdt2s\") pod \"image-registry-7bdb7cb467-cqn7c\" (UID: \"2fc25d12-330f-41bf-82d8-7158f40715bf\") " pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.476210 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.476144 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s2sjx" Apr 20 14:56:25.528428 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.528402 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:25.598496 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.598468 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s2sjx"] Apr 20 14:56:25.601870 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:25.601838 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod433b9f99_bb71_468d_9a18_83234a426f09.slice/crio-1ab4d6af4d7dc6ccd2d3d0a66be10697f2916c437063972a72aed15c13994d59 WatchSource:0}: Error finding container 1ab4d6af4d7dc6ccd2d3d0a66be10697f2916c437063972a72aed15c13994d59: Status 404 returned error can't find the container with id 1ab4d6af4d7dc6ccd2d3d0a66be10697f2916c437063972a72aed15c13994d59 Apr 20 14:56:25.650372 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:25.650234 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bdb7cb467-cqn7c"] Apr 20 14:56:25.656117 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:25.656082 2575 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fc25d12_330f_41bf_82d8_7158f40715bf.slice/crio-6e6695c8e31b49699db5496ac127dde08ee57171a3cfbb6224cd81cc6642bb73 WatchSource:0}: Error finding container 6e6695c8e31b49699db5496ac127dde08ee57171a3cfbb6224cd81cc6642bb73: Status 404 returned error can't find the container with id 6e6695c8e31b49699db5496ac127dde08ee57171a3cfbb6224cd81cc6642bb73 Apr 20 14:56:26.290219 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:26.290181 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s2sjx" event={"ID":"433b9f99-bb71-468d-9a18-83234a426f09","Type":"ContainerStarted","Data":"24cb795694e65590d0b3920f7792b7a04603c26540e05c799fba81b4dd92da42"} Apr 20 14:56:26.290219 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:26.290221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s2sjx" event={"ID":"433b9f99-bb71-468d-9a18-83234a426f09","Type":"ContainerStarted","Data":"613180b27abe094d4a87bc0a69e1d45543c4eb4a65cc875e19d7785dd943eb84"} Apr 20 14:56:26.290592 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:26.290234 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s2sjx" event={"ID":"433b9f99-bb71-468d-9a18-83234a426f09","Type":"ContainerStarted","Data":"1ab4d6af4d7dc6ccd2d3d0a66be10697f2916c437063972a72aed15c13994d59"} Apr 20 14:56:26.291494 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:26.291472 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" event={"ID":"2fc25d12-330f-41bf-82d8-7158f40715bf","Type":"ContainerStarted","Data":"374964e1722cd33dbf218eab28a7984bcac8c857e98f62bc2092d6d41fb0c3f3"} Apr 20 14:56:26.291494 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:26.291499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" event={"ID":"2fc25d12-330f-41bf-82d8-7158f40715bf","Type":"ContainerStarted","Data":"6e6695c8e31b49699db5496ac127dde08ee57171a3cfbb6224cd81cc6642bb73"} Apr 20 14:56:26.291668 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:26.291587 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" Apr 20 14:56:26.311101 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:26.311063 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c" podStartSLOduration=1.311052296 podStartE2EDuration="1.311052296s" podCreationTimestamp="2026-04-20 14:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:26.309521917 +0000 UTC m=+158.002065416" watchObservedRunningTime="2026-04-20 14:56:26.311052296 +0000 UTC m=+158.003595789" Apr 20 14:56:26.874164 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:26.874121 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gpcl9" podUID="ae2439f5-03aa-43b8-9466-c01fbcb53912" Apr 20 14:56:28.300063 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:28.300002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s2sjx" event={"ID":"433b9f99-bb71-468d-9a18-83234a426f09","Type":"ContainerStarted","Data":"e6ee4b5e8a58de2995c6ae6d25ac0bfcbb60f5d6eec7eebcea1ee9a4bb0f48b8"} Apr 20 14:56:28.320921 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:28.320868 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-s2sjx" podStartSLOduration=1.455921276 
podStartE2EDuration="3.320855452s" podCreationTimestamp="2026-04-20 14:56:25 +0000 UTC" firstStartedPulling="2026-04-20 14:56:25.676498757 +0000 UTC m=+157.369042235" lastFinishedPulling="2026-04-20 14:56:27.541432919 +0000 UTC m=+159.233976411" observedRunningTime="2026-04-20 14:56:28.319925801 +0000 UTC m=+160.012469301" watchObservedRunningTime="2026-04-20 14:56:28.320855452 +0000 UTC m=+160.013398952" Apr 20 14:56:30.119743 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.119705 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:56:30.120243 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.119771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv" Apr 20 14:56:30.122201 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.122176 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e19d0653-6009-45aa-a269-a68af8375182-metrics-tls\") pod \"dns-default-8swcv\" (UID: \"e19d0653-6009-45aa-a269-a68af8375182\") " pod="openshift-dns/dns-default-8swcv" Apr 20 14:56:30.122298 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.122273 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0e64e671-ff76-45fc-b205-a75b74329230-cert\") pod \"ingress-canary-w22hl\" (UID: \"0e64e671-ff76-45fc-b205-a75b74329230\") " pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:56:30.388902 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.388823 
2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rc7ph\"" Apr 20 14:56:30.389709 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.389693 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jp5zp\"" Apr 20 14:56:30.397480 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.397466 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8swcv" Apr 20 14:56:30.397571 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.397553 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-w22hl" Apr 20 14:56:30.535156 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.535128 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8swcv"] Apr 20 14:56:30.538444 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:30.538414 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode19d0653_6009_45aa_a269_a68af8375182.slice/crio-0f941d8002c558914e6b1b72b4dc3f67b7878e34521a43dae9513d9eea61562b WatchSource:0}: Error finding container 0f941d8002c558914e6b1b72b4dc3f67b7878e34521a43dae9513d9eea61562b: Status 404 returned error can't find the container with id 0f941d8002c558914e6b1b72b4dc3f67b7878e34521a43dae9513d9eea61562b Apr 20 14:56:30.554286 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:30.554265 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-w22hl"] Apr 20 14:56:30.556807 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:30.556770 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e64e671_ff76_45fc_b205_a75b74329230.slice/crio-79a66fdb14de61db190d2970d6d505168141ab6539806a69b04a1b03bffc9516 
WatchSource:0}: Error finding container 79a66fdb14de61db190d2970d6d505168141ab6539806a69b04a1b03bffc9516: Status 404 returned error can't find the container with id 79a66fdb14de61db190d2970d6d505168141ab6539806a69b04a1b03bffc9516 Apr 20 14:56:31.308983 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.308932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w22hl" event={"ID":"0e64e671-ff76-45fc-b205-a75b74329230","Type":"ContainerStarted","Data":"79a66fdb14de61db190d2970d6d505168141ab6539806a69b04a1b03bffc9516"} Apr 20 14:56:31.310230 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.310200 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8swcv" event={"ID":"e19d0653-6009-45aa-a269-a68af8375182","Type":"ContainerStarted","Data":"0f941d8002c558914e6b1b72b4dc3f67b7878e34521a43dae9513d9eea61562b"} Apr 20 14:56:31.866815 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.866786 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bmrzc"] Apr 20 14:56:31.870190 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.870168 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.873001 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.872963 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 14:56:31.873143 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.873063 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 14:56:31.873143 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.873080 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 14:56:31.873354 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.873338 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2xwxs\"" Apr 20 14:56:31.873936 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.873914 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 14:56:31.936823 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.936792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9528386-7ec4-424d-aa07-1eca20561056-metrics-client-ca\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.936990 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.936837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-tls\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " 
pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.936990 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.936941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-root\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.937144 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.937046 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-accelerators-collector-config\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.937195 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.937135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-textfile\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.937195 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.937178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-sys\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.937287 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.937200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-wtmp\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.937287 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.937229 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmv9q\" (UniqueName: \"kubernetes.io/projected/b9528386-7ec4-424d-aa07-1eca20561056-kube-api-access-jmv9q\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:31.937287 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:31.937273 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.038578 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-accelerators-collector-config\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.038772 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038597 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-textfile\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.038772 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-sys\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.038772 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038648 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-wtmp\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.038772 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmv9q\" (UniqueName: \"kubernetes.io/projected/b9528386-7ec4-424d-aa07-1eca20561056-kube-api-access-jmv9q\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.038772 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-sys\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.039049 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 
14:56:32.039049 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9528386-7ec4-424d-aa07-1eca20561056-metrics-client-ca\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.039049 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.038950 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-tls\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.039049 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.039005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-root\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.039260 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.039056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-textfile\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.039260 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.039150 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-root\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.039260 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:56:32.039184 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-wtmp\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.039260 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:32.039189 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 14:56:32.039260 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.039243 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-accelerators-collector-config\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.039260 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:32.039261 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-tls podName:b9528386-7ec4-424d-aa07-1eca20561056 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:32.539239139 +0000 UTC m=+164.231782629 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-tls") pod "node-exporter-bmrzc" (UID: "b9528386-7ec4-424d-aa07-1eca20561056") : secret "node-exporter-tls" not found Apr 20 14:56:32.040211 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.040183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9528386-7ec4-424d-aa07-1eca20561056-metrics-client-ca\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.041484 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.041463 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.050521 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.050497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmv9q\" (UniqueName: \"kubernetes.io/projected/b9528386-7ec4-424d-aa07-1eca20561056-kube-api-access-jmv9q\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.543725 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.543617 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-tls\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.546759 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.546730 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b9528386-7ec4-424d-aa07-1eca20561056-node-exporter-tls\") pod \"node-exporter-bmrzc\" (UID: \"b9528386-7ec4-424d-aa07-1eca20561056\") " pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.780314 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.780287 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-bmrzc" Apr 20 14:56:32.789479 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:32.789443 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9528386_7ec4_424d_aa07_1eca20561056.slice/crio-f65f84a78cd0909c6c8b16f6b1692b258d79b4daf31ec782e6fbfa1d7448c0f5 WatchSource:0}: Error finding container f65f84a78cd0909c6c8b16f6b1692b258d79b4daf31ec782e6fbfa1d7448c0f5: Status 404 returned error can't find the container with id f65f84a78cd0909c6c8b16f6b1692b258d79b4daf31ec782e6fbfa1d7448c0f5 Apr 20 14:56:32.895664 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.895632 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:56:32.900289 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.900273 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.903480 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.903454 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 14:56:32.903595 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.903550 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 14:56:32.903993 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.903969 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 14:56:32.904129 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.904007 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6v6dq\"" Apr 20 14:56:32.904214 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.904200 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 14:56:32.904470 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.904438 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 14:56:32.904470 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.904442 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 14:56:32.904788 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.904769 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 14:56:32.904871 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.904777 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 14:56:32.904871 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.904822 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 14:56:32.916530 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.916512 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:56:32.947455 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947571 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-out\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947571 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947571 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947555 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947707 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947607 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947707 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947638 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-web-config\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947707 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947677 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947707 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947699 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b2lk\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-kube-api-access-7b2lk\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947854 ip-10-0-140-93 kubenswrapper[2575]: I0420 
14:56:32.947717 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-volume\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.947854 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947804 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.948007 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947888 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:32.948007 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:32.947920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.048830 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.048760 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.048830 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.048800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.048838 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-out\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.048853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 kubenswrapper[2575]: 
I0420 14:56:33.048871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.048903 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.048932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-web-config\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.048962 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.048982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b2lk\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-kube-api-access-7b2lk\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:56:33.049001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049043 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.049045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-volume\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049490 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.049074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049490 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.049118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.049490 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:56:33.049283 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle podName:d88c1951-f0da-4059-a1c4-e1ca624bee9b nodeName:}" failed. 
No retries permitted until 2026-04-20 14:56:33.549262466 +0000 UTC m=+165.241806154 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b") : configmap references non-existent config key: ca-bundle.crt Apr 20 14:56:33.051083 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.050051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.051083 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.050696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.052321 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.052296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.052545 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.052524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-out\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.052814 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.052786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.052895 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.052874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.052955 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.052926 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.053836 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.053818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-web-config\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.053909 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.053883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-tls-assets\") pod \"alertmanager-main-0\" (UID: 
\"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.053946 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.053905 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-volume\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.053979 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.053944 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.057958 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.057931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b2lk\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-kube-api-access-7b2lk\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:33.320756 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.320646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8swcv" event={"ID":"e19d0653-6009-45aa-a269-a68af8375182","Type":"ContainerStarted","Data":"9a91b66184d960e67c8dbf0b9ae0d4d3e14f66f7861872c98a4a5c1a305ccf01"} Apr 20 14:56:33.320756 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.320690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8swcv" event={"ID":"e19d0653-6009-45aa-a269-a68af8375182","Type":"ContainerStarted","Data":"33e657118e16642856881ddef1086b0379bb0bc7e8543c8c96dbf58107bd7310"} Apr 20 
14:56:33.320973 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.320819 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8swcv" Apr 20 14:56:33.322380 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.322343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-w22hl" event={"ID":"0e64e671-ff76-45fc-b205-a75b74329230","Type":"ContainerStarted","Data":"18f2bc5e97a9f8597223233e41e4da458cbab879b6e0cec36aa49e531c48e8d7"} Apr 20 14:56:33.323552 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.323529 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bmrzc" event={"ID":"b9528386-7ec4-424d-aa07-1eca20561056","Type":"ContainerStarted","Data":"f65f84a78cd0909c6c8b16f6b1692b258d79b4daf31ec782e6fbfa1d7448c0f5"} Apr 20 14:56:33.335244 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.335200 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8swcv" podStartSLOduration=129.522759859 podStartE2EDuration="2m11.335189073s" podCreationTimestamp="2026-04-20 14:54:22 +0000 UTC" firstStartedPulling="2026-04-20 14:56:30.540419891 +0000 UTC m=+162.232963372" lastFinishedPulling="2026-04-20 14:56:32.352849103 +0000 UTC m=+164.045392586" observedRunningTime="2026-04-20 14:56:33.334568927 +0000 UTC m=+165.027112426" watchObservedRunningTime="2026-04-20 14:56:33.335189073 +0000 UTC m=+165.027732573" Apr 20 14:56:33.347682 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.347637 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-w22hl" podStartSLOduration=129.548901932 podStartE2EDuration="2m11.347625705s" podCreationTimestamp="2026-04-20 14:54:22 +0000 UTC" firstStartedPulling="2026-04-20 14:56:30.558572612 +0000 UTC m=+162.251116103" lastFinishedPulling="2026-04-20 14:56:32.357296396 +0000 UTC m=+164.049839876" 
observedRunningTime="2026-04-20 14:56:33.347407087 +0000 UTC m=+165.039950585" watchObservedRunningTime="2026-04-20 14:56:33.347625705 +0000 UTC m=+165.040169205"
Apr 20 14:56:33.552805 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.552767 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:56:33.553703 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.553672 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:56:33.809892 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.809857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:56:33.951210 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:33.951185 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 14:56:33.952272 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:33.952242 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88c1951_f0da_4059_a1c4_e1ca624bee9b.slice/crio-d1e20ba3eb622a6fedf9296e3c8c0cb961b9d4690869acbefcd41bff7cdab5f7 WatchSource:0}: Error finding container d1e20ba3eb622a6fedf9296e3c8c0cb961b9d4690869acbefcd41bff7cdab5f7: Status 404 returned error can't find the container with id d1e20ba3eb622a6fedf9296e3c8c0cb961b9d4690869acbefcd41bff7cdab5f7
Apr 20 14:56:34.327669 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.327633 2575 generic.go:358] "Generic (PLEG): container finished" podID="b9528386-7ec4-424d-aa07-1eca20561056" containerID="a198f19bc9b6d7e76d84d29f8189d8895bd751e925d2ac30957ef84ee4ec9f75" exitCode=0
Apr 20 14:56:34.327834 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.327704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bmrzc" event={"ID":"b9528386-7ec4-424d-aa07-1eca20561056","Type":"ContainerDied","Data":"a198f19bc9b6d7e76d84d29f8189d8895bd751e925d2ac30957ef84ee4ec9f75"}
Apr 20 14:56:34.328796 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.328768 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerStarted","Data":"d1e20ba3eb622a6fedf9296e3c8c0cb961b9d4690869acbefcd41bff7cdab5f7"}
Apr 20 14:56:34.940573 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.940544 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"]
Apr 20 14:56:34.944190 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.944169 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:34.947596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.947578 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 20 14:56:34.948284 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.948253 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 20 14:56:34.948441 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.948284 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-l6vi9rcmuf9h\""
Apr 20 14:56:34.948441 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.948284 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 20 14:56:34.948441 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.948325 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 20 14:56:34.948567 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.948521 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 20 14:56:34.948601 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.948569 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-cn82w\""
Apr 20 14:56:34.960877 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:34.960851 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"]
Apr 20 14:56:35.066362 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.066332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-tls\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.066524 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.066367 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k8tz\" (UniqueName: \"kubernetes.io/projected/bfbf3447-579f-420e-b165-be48ea35efad-kube-api-access-8k8tz\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.066524 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.066406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-grpc-tls\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.066524 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.066450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.066524 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.066473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.066524 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.066499 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfbf3447-579f-420e-b165-be48ea35efad-metrics-client-ca\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.066698 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.066572 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.066698 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.066594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.167708 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.167630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.167708 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.167668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.167708 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.167706 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-tls\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.167926 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.167725 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8k8tz\" (UniqueName: \"kubernetes.io/projected/bfbf3447-579f-420e-b165-be48ea35efad-kube-api-access-8k8tz\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.167926 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.167750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-grpc-tls\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.167926 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.167778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.167926 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.167800 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.167926 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.167820 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfbf3447-579f-420e-b165-be48ea35efad-metrics-client-ca\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.169672 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.169644 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bfbf3447-579f-420e-b165-be48ea35efad-metrics-client-ca\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.171136 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.171106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.171234 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.171211 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.171387 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.171362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.171614 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.171481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-tls\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.171614 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.171500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-grpc-tls\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.171614 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.171520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bfbf3447-579f-420e-b165-be48ea35efad-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.175715 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.175697 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k8tz\" (UniqueName: \"kubernetes.io/projected/bfbf3447-579f-420e-b165-be48ea35efad-kube-api-access-8k8tz\") pod \"thanos-querier-5f5ccfc466-mqcfd\" (UID: \"bfbf3447-579f-420e-b165-be48ea35efad\") " pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.209501 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.209481 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv"
Apr 20 14:56:35.254041 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.253995 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:35.333878 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.333805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bmrzc" event={"ID":"b9528386-7ec4-424d-aa07-1eca20561056","Type":"ContainerStarted","Data":"1accb932d1bf2b35e7b9e9cfcdfaded6df90993fdd913a2dbeb9626d25de6d28"}
Apr 20 14:56:35.333878 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.333849 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bmrzc" event={"ID":"b9528386-7ec4-424d-aa07-1eca20561056","Type":"ContainerStarted","Data":"190c6346713b2490a702c8a8ad0dd56a455317cff138619a4ef337dd036ef88d"}
Apr 20 14:56:35.335277 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.335252 2575 generic.go:358] "Generic (PLEG): container finished" podID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerID="29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0" exitCode=0
Apr 20 14:56:35.335381 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.335319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerDied","Data":"29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0"}
Apr 20 14:56:35.352329 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.352284 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bmrzc" podStartSLOduration=3.470778389 podStartE2EDuration="4.352269078s" podCreationTimestamp="2026-04-20 14:56:31 +0000 UTC" firstStartedPulling="2026-04-20 14:56:32.791460893 +0000 UTC m=+164.484004373" lastFinishedPulling="2026-04-20 14:56:33.67295157 +0000 UTC m=+165.365495062" observedRunningTime="2026-04-20 14:56:35.350504447 +0000 UTC m=+167.043047945" watchObservedRunningTime="2026-04-20 14:56:35.352269078 +0000 UTC m=+167.044812634"
Apr 20 14:56:35.379282 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:35.379225 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"]
Apr 20 14:56:35.381934 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:35.381909 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfbf3447_579f_420e_b165_be48ea35efad.slice/crio-1c6c8e57caafa32ca5b4e450bf81597b490e2f6f3a4c454e6d7698bebaa47fcc WatchSource:0}: Error finding container 1c6c8e57caafa32ca5b4e450bf81597b490e2f6f3a4c454e6d7698bebaa47fcc: Status 404 returned error can't find the container with id 1c6c8e57caafa32ca5b4e450bf81597b490e2f6f3a4c454e6d7698bebaa47fcc
Apr 20 14:56:36.345508 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.345411 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd" event={"ID":"bfbf3447-579f-420e-b165-be48ea35efad","Type":"ContainerStarted","Data":"1c6c8e57caafa32ca5b4e450bf81597b490e2f6f3a4c454e6d7698bebaa47fcc"}
Apr 20 14:56:36.613310 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.613239 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"]
Apr 20 14:56:36.616829 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.616806 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"
Apr 20 14:56:36.619394 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.619166 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 14:56:36.619394 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.619203 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-gr5fb\""
Apr 20 14:56:36.625304 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.625275 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"]
Apr 20 14:56:36.685153 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.685120 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f215683-8eed-4b7b-9022-2e52ec9b239d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wdcrf\" (UID: \"3f215683-8eed-4b7b-9022-2e52ec9b239d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"
Apr 20 14:56:36.786089 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.786053 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f215683-8eed-4b7b-9022-2e52ec9b239d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wdcrf\" (UID: \"3f215683-8eed-4b7b-9022-2e52ec9b239d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"
Apr 20 14:56:36.788820 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.788797 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/3f215683-8eed-4b7b-9022-2e52ec9b239d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wdcrf\" (UID: \"3f215683-8eed-4b7b-9022-2e52ec9b239d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"
Apr 20 14:56:36.929417 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:36.929325 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"
Apr 20 14:56:37.345102 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:37.345071 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"]
Apr 20 14:56:37.352422 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:37.352367 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerStarted","Data":"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1"}
Apr 20 14:56:37.354620 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:37.354594 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd" event={"ID":"bfbf3447-579f-420e-b165-be48ea35efad","Type":"ContainerStarted","Data":"7fcf6ad3a82f3bfb21a861004dd29989205b6d55d8c4c519c792a96b6f82175a"}
Apr 20 14:56:38.360313 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:38.360276 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerStarted","Data":"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1"}
Apr 20 14:56:38.360313 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:38.360314 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerStarted","Data":"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666"}
Apr 20 14:56:38.360802 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:38.360327 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerStarted","Data":"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895"}
Apr 20 14:56:38.360802 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:38.360338 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerStarted","Data":"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70"}
Apr 20 14:56:38.361529 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:38.361489 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf" event={"ID":"3f215683-8eed-4b7b-9022-2e52ec9b239d","Type":"ContainerStarted","Data":"ff518920dc7ca1bc3718785b9e883911a19bf4e0d81005ac2edb253e3089c4ad"}
Apr 20 14:56:38.363523 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:38.363466 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd" event={"ID":"bfbf3447-579f-420e-b165-be48ea35efad","Type":"ContainerStarted","Data":"2fd0cd0a3bc6affb01796fe5a1c6c19cb410277cb9792bdaa17d243ea33dd648"}
Apr 20 14:56:38.363523 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:38.363501 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd" event={"ID":"bfbf3447-579f-420e-b165-be48ea35efad","Type":"ContainerStarted","Data":"79ed4b8a8bf1d7111d16b651948ea16ed4b793545cdfb1812f66a2873daac80a"}
Apr 20 14:56:39.371776 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.371738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerStarted","Data":"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b"}
Apr 20 14:56:39.373214 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.373182 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf" event={"ID":"3f215683-8eed-4b7b-9022-2e52ec9b239d","Type":"ContainerStarted","Data":"2d124bd33b40bb77158c0ef81d7fadc220b739a21b26bb4dc4d993ad54952f5f"}
Apr 20 14:56:39.373408 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.373394 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"
Apr 20 14:56:39.375895 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.375868 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd" event={"ID":"bfbf3447-579f-420e-b165-be48ea35efad","Type":"ContainerStarted","Data":"eec18a8056aa1dea900f1e78b5f0471fa114ba2675ce87f375f71a5209720a95"}
Apr 20 14:56:39.376055 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.375899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd" event={"ID":"bfbf3447-579f-420e-b165-be48ea35efad","Type":"ContainerStarted","Data":"30735e486b199ce4f262a53c87894c7be11c5174f72cbc9d52d27ea140cdc35b"}
Apr 20 14:56:39.376055 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.375912 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd" event={"ID":"bfbf3447-579f-420e-b165-be48ea35efad","Type":"ContainerStarted","Data":"048c86e80f86bc58467fea2a953a7c721d7b95753b9971819d69a148c543304e"}
Apr 20 14:56:39.376167 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.376089 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:39.378603 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.378581 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf"
Apr 20 14:56:39.399127 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.399080 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.696823174 podStartE2EDuration="7.399068219s" podCreationTimestamp="2026-04-20 14:56:32 +0000 UTC" firstStartedPulling="2026-04-20 14:56:33.954197208 +0000 UTC m=+165.646740687" lastFinishedPulling="2026-04-20 14:56:38.65644224 +0000 UTC m=+170.348985732" observedRunningTime="2026-04-20 14:56:39.397129784 +0000 UTC m=+171.089673283" watchObservedRunningTime="2026-04-20 14:56:39.399068219 +0000 UTC m=+171.091611717"
Apr 20 14:56:39.418778 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.418727 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd" podStartSLOduration=2.14516793 podStartE2EDuration="5.418714333s" podCreationTimestamp="2026-04-20 14:56:34 +0000 UTC" firstStartedPulling="2026-04-20 14:56:35.383688419 +0000 UTC m=+167.076231899" lastFinishedPulling="2026-04-20 14:56:38.657234809 +0000 UTC m=+170.349778302" observedRunningTime="2026-04-20 14:56:39.416418595 +0000 UTC m=+171.108962087" watchObservedRunningTime="2026-04-20 14:56:39.418714333 +0000 UTC m=+171.111257834"
Apr 20 14:56:39.431498 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:39.431449 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wdcrf" podStartSLOduration=2.126467608 podStartE2EDuration="3.43143645s" podCreationTimestamp="2026-04-20 14:56:36 +0000 UTC" firstStartedPulling="2026-04-20 14:56:37.351436504 +0000 UTC m=+169.043979982" lastFinishedPulling="2026-04-20 14:56:38.656405347 +0000 UTC m=+170.348948824" observedRunningTime="2026-04-20 14:56:39.430465395 +0000 UTC m=+171.123008894" watchObservedRunningTime="2026-04-20 14:56:39.43143645 +0000 UTC m=+171.123979949"
Apr 20 14:56:40.864634 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:40.864598 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9"
Apr 20 14:56:43.331126 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:43.331099 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8swcv"
Apr 20 14:56:45.385429 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:45.385403 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5f5ccfc466-mqcfd"
Apr 20 14:56:47.300912 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:47.300881 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7bdb7cb467-cqn7c"
Apr 20 14:56:49.763085 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:49.763043 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-v5stc"]
Apr 20 14:56:49.767499 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:49.767483 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-v5stc"
Apr 20 14:56:49.769869 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:49.769840 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-jm78w\""
Apr 20 14:56:49.770187 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:49.770168 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 14:56:49.770277 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:49.770169 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 14:56:49.776107 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:49.776086 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-v5stc"]
Apr 20 14:56:49.903287 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:49.903248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kzm8\" (UniqueName: \"kubernetes.io/projected/4ccad5fe-a610-4440-ad4f-3cba3e926719-kube-api-access-2kzm8\") pod \"downloads-6bcc868b7-v5stc\" (UID: \"4ccad5fe-a610-4440-ad4f-3cba3e926719\") " pod="openshift-console/downloads-6bcc868b7-v5stc"
Apr 20 14:56:50.004310 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.004284 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kzm8\" (UniqueName: \"kubernetes.io/projected/4ccad5fe-a610-4440-ad4f-3cba3e926719-kube-api-access-2kzm8\") pod \"downloads-6bcc868b7-v5stc\" (UID: \"4ccad5fe-a610-4440-ad4f-3cba3e926719\") " pod="openshift-console/downloads-6bcc868b7-v5stc"
Apr 20 14:56:50.012126 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.012102 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kzm8\" (UniqueName: \"kubernetes.io/projected/4ccad5fe-a610-4440-ad4f-3cba3e926719-kube-api-access-2kzm8\") pod \"downloads-6bcc868b7-v5stc\" (UID: \"4ccad5fe-a610-4440-ad4f-3cba3e926719\") " pod="openshift-console/downloads-6bcc868b7-v5stc"
Apr 20 14:56:50.077567 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.077537 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-v5stc"
Apr 20 14:56:50.194398 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.194348 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-v5stc"]
Apr 20 14:56:50.197192 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:56:50.197166 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ccad5fe_a610_4440_ad4f_3cba3e926719.slice/crio-e7bf5189eb7855bb5c1477b0728f5350635a138936fd30b998e14a9404f5bfb4 WatchSource:0}: Error finding container e7bf5189eb7855bb5c1477b0728f5350635a138936fd30b998e14a9404f5bfb4: Status 404 returned error can't find the container with id e7bf5189eb7855bb5c1477b0728f5350635a138936fd30b998e14a9404f5bfb4
Apr 20 14:56:50.222985 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.222930 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" podUID="ced62a51-e841-46d2-b901-6bf468781d83" containerName="registry" containerID="cri-o://88a7a0c1664e24c0ebad907f50055fb2da587c85cae9f8510e8c6296dcda76e2" gracePeriod=30
Apr 20 14:56:50.412582 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.412553 2575 generic.go:358] "Generic (PLEG): container finished" podID="ced62a51-e841-46d2-b901-6bf468781d83" containerID="88a7a0c1664e24c0ebad907f50055fb2da587c85cae9f8510e8c6296dcda76e2" exitCode=0
Apr 20 14:56:50.412724 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.412587 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" event={"ID":"ced62a51-e841-46d2-b901-6bf468781d83","Type":"ContainerDied","Data":"88a7a0c1664e24c0ebad907f50055fb2da587c85cae9f8510e8c6296dcda76e2"}
Apr 20 14:56:50.413727 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.413691 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-v5stc" event={"ID":"4ccad5fe-a610-4440-ad4f-3cba3e926719","Type":"ContainerStarted","Data":"e7bf5189eb7855bb5c1477b0728f5350635a138936fd30b998e14a9404f5bfb4"}
Apr 20 14:56:50.448355 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.448335 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv"
Apr 20 14:56:50.611257 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611151 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlr95\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-kube-api-access-mlr95\") pod \"ced62a51-e841-46d2-b901-6bf468781d83\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") "
Apr 20 14:56:50.611257 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611210 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-registry-certificates\") pod \"ced62a51-e841-46d2-b901-6bf468781d83\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") "
Apr 20 14:56:50.611257 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611234 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-image-registry-private-configuration\") pod \"ced62a51-e841-46d2-b901-6bf468781d83\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") "
Apr 20 14:56:50.611257 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611260 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") pod \"ced62a51-e841-46d2-b901-6bf468781d83\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") "
Apr 20 14:56:50.611572 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611284 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-installation-pull-secrets\") pod \"ced62a51-e841-46d2-b901-6bf468781d83\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") "
Apr 20 14:56:50.611572 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611314 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ced62a51-e841-46d2-b901-6bf468781d83-ca-trust-extracted\") pod \"ced62a51-e841-46d2-b901-6bf468781d83\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") "
Apr 20 14:56:50.611572 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611425 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-bound-sa-token\") pod \"ced62a51-e841-46d2-b901-6bf468781d83\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") "
Apr 20 14:56:50.611572 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611501 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-trusted-ca\") pod \"ced62a51-e841-46d2-b901-6bf468781d83\" (UID: \"ced62a51-e841-46d2-b901-6bf468781d83\") "
Apr 20 14:56:50.611770 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611621 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "ced62a51-e841-46d2-b901-6bf468781d83" (UID: "ced62a51-e841-46d2-b901-6bf468781d83"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:56:50.611946 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.611922 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-registry-certificates\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:56:50.612072 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.612051 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "ced62a51-e841-46d2-b901-6bf468781d83" (UID: "ced62a51-e841-46d2-b901-6bf468781d83"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 14:56:50.613923 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.613892 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "ced62a51-e841-46d2-b901-6bf468781d83" (UID: "ced62a51-e841-46d2-b901-6bf468781d83"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 14:56:50.614049 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.613935 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "ced62a51-e841-46d2-b901-6bf468781d83" (UID: "ced62a51-e841-46d2-b901-6bf468781d83").
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:56:50.614049 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.613991 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "ced62a51-e841-46d2-b901-6bf468781d83" (UID: "ced62a51-e841-46d2-b901-6bf468781d83"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:56:50.614118 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.614057 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-kube-api-access-mlr95" (OuterVolumeSpecName: "kube-api-access-mlr95") pod "ced62a51-e841-46d2-b901-6bf468781d83" (UID: "ced62a51-e841-46d2-b901-6bf468781d83"). InnerVolumeSpecName "kube-api-access-mlr95". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:56:50.614118 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.614070 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "ced62a51-e841-46d2-b901-6bf468781d83" (UID: "ced62a51-e841-46d2-b901-6bf468781d83"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:56:50.620235 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.620215 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ced62a51-e841-46d2-b901-6bf468781d83-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "ced62a51-e841-46d2-b901-6bf468781d83" (UID: "ced62a51-e841-46d2-b901-6bf468781d83"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:56:50.712576 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.712534 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlr95\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-kube-api-access-mlr95\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:56:50.712576 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.712573 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-image-registry-private-configuration\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:56:50.712769 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.712589 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-registry-tls\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:56:50.712769 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.712604 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ced62a51-e841-46d2-b901-6bf468781d83-installation-pull-secrets\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:56:50.712769 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.712618 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ced62a51-e841-46d2-b901-6bf468781d83-ca-trust-extracted\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:56:50.712769 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.712631 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ced62a51-e841-46d2-b901-6bf468781d83-bound-sa-token\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 
20 14:56:50.712769 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:50.712645 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ced62a51-e841-46d2-b901-6bf468781d83-trusted-ca\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:56:51.417643 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:51.417603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" event={"ID":"ced62a51-e841-46d2-b901-6bf468781d83","Type":"ContainerDied","Data":"9ea390edf7e3ac89e8ffa21ddac341ffdd030bcd4f2dcf47fa3dc11d1a35e115"} Apr 20 14:56:51.417643 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:51.417646 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b6cbd5b9-fw9hv" Apr 20 14:56:51.418133 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:51.417655 2575 scope.go:117] "RemoveContainer" containerID="88a7a0c1664e24c0ebad907f50055fb2da587c85cae9f8510e8c6296dcda76e2" Apr 20 14:56:51.439174 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:51.439141 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b6cbd5b9-fw9hv"] Apr 20 14:56:51.441834 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:51.441808 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-b6cbd5b9-fw9hv"] Apr 20 14:56:52.868855 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:56:52.868820 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ced62a51-e841-46d2-b901-6bf468781d83" path="/var/lib/kubelet/pods/ced62a51-e841-46d2-b901-6bf468781d83/volumes" Apr 20 14:57:01.378788 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.378752 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f7f7cbc74-rdpfq"] Apr 20 14:57:01.379490 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.379201 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ced62a51-e841-46d2-b901-6bf468781d83" containerName="registry" Apr 20 14:57:01.379490 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.379223 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="ced62a51-e841-46d2-b901-6bf468781d83" containerName="registry" Apr 20 14:57:01.379490 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.379319 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="ced62a51-e841-46d2-b901-6bf468781d83" containerName="registry" Apr 20 14:57:01.382399 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.382378 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.384862 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.384839 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 14:57:01.384862 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.384854 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 14:57:01.385800 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.385774 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 14:57:01.385922 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.385810 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-hps5j\"" Apr 20 14:57:01.385922 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.385825 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 14:57:01.385922 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.385861 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 14:57:01.390211 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.390189 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f7f7cbc74-rdpfq"] Apr 20 14:57:01.410223 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.410190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-oauth-serving-cert\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.410340 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.410238 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-service-ca\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.410340 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.410280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-serving-cert\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.410340 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.410318 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nxgb\" (UniqueName: \"kubernetes.io/projected/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-kube-api-access-8nxgb\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.410485 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.410365 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-oauth-config\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.410485 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.410431 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-config\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.511342 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.511300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-oauth-serving-cert\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.511517 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.511348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-service-ca\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.511517 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.511389 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-serving-cert\") pod \"console-5f7f7cbc74-rdpfq\" (UID: 
\"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.511517 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.511428 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nxgb\" (UniqueName: \"kubernetes.io/projected/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-kube-api-access-8nxgb\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.511517 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.511475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-oauth-config\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.511517 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.511510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-config\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.512147 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.512117 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-oauth-serving-cert\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.512260 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.512191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-config\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.512260 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.512191 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-service-ca\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.514389 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.514365 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-serving-cert\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.514470 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.514442 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-oauth-config\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.520674 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.520650 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nxgb\" (UniqueName: \"kubernetes.io/projected/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-kube-api-access-8nxgb\") pod \"console-5f7f7cbc74-rdpfq\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:01.694366 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:01.694289 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:06.241609 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:06.241585 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f7f7cbc74-rdpfq"] Apr 20 14:57:06.244101 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:57:06.244075 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2b42fbd_08c4_47f6_9df2_afb02ec9dbba.slice/crio-8bc2f1a21c5c9057f1c92a30b3c2db1a3b8454e8446d6ae7f4e3ae9490b04259 WatchSource:0}: Error finding container 8bc2f1a21c5c9057f1c92a30b3c2db1a3b8454e8446d6ae7f4e3ae9490b04259: Status 404 returned error can't find the container with id 8bc2f1a21c5c9057f1c92a30b3c2db1a3b8454e8446d6ae7f4e3ae9490b04259 Apr 20 14:57:06.465670 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:06.465577 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f7f7cbc74-rdpfq" event={"ID":"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba","Type":"ContainerStarted","Data":"8bc2f1a21c5c9057f1c92a30b3c2db1a3b8454e8446d6ae7f4e3ae9490b04259"} Apr 20 14:57:06.467272 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:06.467235 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-v5stc" event={"ID":"4ccad5fe-a610-4440-ad4f-3cba3e926719","Type":"ContainerStarted","Data":"bbfabd349f87a9578ed01cac964ed3bb8d2feadfb145469bebb0566b6981f433"} Apr 20 14:57:06.467554 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:06.467532 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-v5stc" Apr 20 14:57:06.480962 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:06.480933 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-v5stc" Apr 20 14:57:06.483535 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:06.483495 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-v5stc" podStartSLOduration=1.459773267 podStartE2EDuration="17.483481292s" podCreationTimestamp="2026-04-20 14:56:49 +0000 UTC" firstStartedPulling="2026-04-20 14:56:50.199204457 +0000 UTC m=+181.891747936" lastFinishedPulling="2026-04-20 14:57:06.22291247 +0000 UTC m=+197.915455961" observedRunningTime="2026-04-20 14:57:06.482268066 +0000 UTC m=+198.174811579" watchObservedRunningTime="2026-04-20 14:57:06.483481292 +0000 UTC m=+198.176024790" Apr 20 14:57:10.463982 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.463932 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-699684764b-wwtbc"] Apr 20 14:57:10.488041 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.487993 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699684764b-wwtbc"] Apr 20 14:57:10.488213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.488046 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f7f7cbc74-rdpfq" event={"ID":"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba","Type":"ContainerStarted","Data":"3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52"} Apr 20 14:57:10.488213 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.488139 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.501438 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.501402 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 14:57:10.505814 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.505737 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f7f7cbc74-rdpfq" podStartSLOduration=5.92113673 podStartE2EDuration="9.505720646s" podCreationTimestamp="2026-04-20 14:57:01 +0000 UTC" firstStartedPulling="2026-04-20 14:57:06.24577096 +0000 UTC m=+197.938314440" lastFinishedPulling="2026-04-20 14:57:09.830354866 +0000 UTC m=+201.522898356" observedRunningTime="2026-04-20 14:57:10.505238173 +0000 UTC m=+202.197781686" watchObservedRunningTime="2026-04-20 14:57:10.505720646 +0000 UTC m=+202.198264148" Apr 20 14:57:10.590794 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.590754 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbq4d\" (UniqueName: \"kubernetes.io/projected/bee525db-20a4-49db-8d07-54d179347cf5-kube-api-access-qbq4d\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.590985 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.590825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-console-config\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.590985 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.590862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-oauth-config\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.590985 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.590898 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-oauth-serving-cert\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.591155 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.591042 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-trusted-ca-bundle\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.591155 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.591112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-service-ca\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.591221 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.591182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-serving-cert\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.692631 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:57:10.692587 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-oauth-serving-cert\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.692826 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.692657 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-trusted-ca-bundle\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.692826 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.692699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-service-ca\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.692826 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.692738 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-serving-cert\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.692826 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.692785 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbq4d\" (UniqueName: \"kubernetes.io/projected/bee525db-20a4-49db-8d07-54d179347cf5-kube-api-access-qbq4d\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " 
pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.692826 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.692814 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-console-config\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.693049 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.692848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-oauth-config\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.693469 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.693423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-service-ca\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.693469 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.693422 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-oauth-serving-cert\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.693820 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.693634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-trusted-ca-bundle\") pod \"console-699684764b-wwtbc\" (UID: 
\"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.693875 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.693829 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-console-config\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.695543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.695520 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-oauth-config\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.695711 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.695690 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-serving-cert\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.701415 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.701388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbq4d\" (UniqueName: \"kubernetes.io/projected/bee525db-20a4-49db-8d07-54d179347cf5-kube-api-access-qbq4d\") pod \"console-699684764b-wwtbc\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.804786 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.804751 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:10.945877 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:10.945846 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699684764b-wwtbc"] Apr 20 14:57:10.948618 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:57:10.948583 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbee525db_20a4_49db_8d07_54d179347cf5.slice/crio-e534124a44d78a57381e8debfcbdc668a1ccc2f40c2302207b4d6dc307cc38d8 WatchSource:0}: Error finding container e534124a44d78a57381e8debfcbdc668a1ccc2f40c2302207b4d6dc307cc38d8: Status 404 returned error can't find the container with id e534124a44d78a57381e8debfcbdc668a1ccc2f40c2302207b4d6dc307cc38d8 Apr 20 14:57:11.487883 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:11.487796 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699684764b-wwtbc" event={"ID":"bee525db-20a4-49db-8d07-54d179347cf5","Type":"ContainerStarted","Data":"868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1"} Apr 20 14:57:11.487883 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:11.487843 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699684764b-wwtbc" event={"ID":"bee525db-20a4-49db-8d07-54d179347cf5","Type":"ContainerStarted","Data":"e534124a44d78a57381e8debfcbdc668a1ccc2f40c2302207b4d6dc307cc38d8"} Apr 20 14:57:11.509330 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:11.509282 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-699684764b-wwtbc" podStartSLOduration=1.509262413 podStartE2EDuration="1.509262413s" podCreationTimestamp="2026-04-20 14:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:57:11.508547373 +0000 UTC m=+203.201090873" 
watchObservedRunningTime="2026-04-20 14:57:11.509262413 +0000 UTC m=+203.201805915" Apr 20 14:57:11.695590 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:11.695550 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:11.695755 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:11.695629 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:11.701736 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:11.701713 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:12.495929 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:12.495899 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:13.496108 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:13.496076 2575 generic.go:358] "Generic (PLEG): container finished" podID="34f2a619-5bb8-4702-ae7e-217e448429bc" containerID="9f4484cde0bf271de562b1b10f7231b49bc53bda087aea8c4aaea7962d0dbacf" exitCode=0 Apr 20 14:57:13.496586 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:13.496165 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7rn4l" event={"ID":"34f2a619-5bb8-4702-ae7e-217e448429bc","Type":"ContainerDied","Data":"9f4484cde0bf271de562b1b10f7231b49bc53bda087aea8c4aaea7962d0dbacf"} Apr 20 14:57:13.496694 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:13.496677 2575 scope.go:117] "RemoveContainer" containerID="9f4484cde0bf271de562b1b10f7231b49bc53bda087aea8c4aaea7962d0dbacf" Apr 20 14:57:14.501409 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:14.501370 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7rn4l" 
event={"ID":"34f2a619-5bb8-4702-ae7e-217e448429bc","Type":"ContainerStarted","Data":"745dea9ef725f1915ff3ef367572c77780fc8ec06d3d04c9d6723a299148e601"} Apr 20 14:57:20.805341 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:20.805293 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:20.805840 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:20.805375 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:20.810375 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:20.810353 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:21.528661 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:21.528636 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:57:21.575226 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:21.575195 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f7f7cbc74-rdpfq"] Apr 20 14:57:46.600500 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:46.600452 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f7f7cbc74-rdpfq" podUID="a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" containerName="console" containerID="cri-o://3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52" gracePeriod=15 Apr 20 14:57:46.875458 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:46.875437 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f7f7cbc74-rdpfq_a2b42fbd-08c4-47f6-9df2-afb02ec9dbba/console/0.log" Apr 20 14:57:46.875582 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:46.875495 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:47.009372 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009337 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-oauth-serving-cert\") pod \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " Apr 20 14:57:47.009372 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009378 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nxgb\" (UniqueName: \"kubernetes.io/projected/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-kube-api-access-8nxgb\") pod \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " Apr 20 14:57:47.009610 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009424 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-service-ca\") pod \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " Apr 20 14:57:47.009610 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009452 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-serving-cert\") pod \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " Apr 20 14:57:47.009610 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009480 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-oauth-config\") pod \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " Apr 20 14:57:47.009610 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009541 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-config\") pod \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\" (UID: \"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba\") " Apr 20 14:57:47.009869 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009841 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" (UID: "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:47.009923 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009901 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-service-ca" (OuterVolumeSpecName: "service-ca") pod "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" (UID: "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:47.010006 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.009982 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-config" (OuterVolumeSpecName: "console-config") pod "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" (UID: "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:47.011876 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.011851 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" (UID: "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:47.011876 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.011875 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-kube-api-access-8nxgb" (OuterVolumeSpecName: "kube-api-access-8nxgb") pod "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" (UID: "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba"). InnerVolumeSpecName "kube-api-access-8nxgb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:47.012052 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.011898 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" (UID: "a2b42fbd-08c4-47f6-9df2-afb02ec9dbba"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:47.110551 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.110484 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-service-ca\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:47.110551 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.110509 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-serving-cert\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:47.110551 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.110519 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-oauth-config\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:47.110551 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.110529 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-console-config\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:47.110551 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.110538 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-oauth-serving-cert\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:47.110551 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.110546 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nxgb\" (UniqueName: \"kubernetes.io/projected/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba-kube-api-access-8nxgb\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:47.606409 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:57:47.606382 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f7f7cbc74-rdpfq_a2b42fbd-08c4-47f6-9df2-afb02ec9dbba/console/0.log" Apr 20 14:57:47.606871 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.606426 2575 generic.go:358] "Generic (PLEG): container finished" podID="a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" containerID="3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52" exitCode=2 Apr 20 14:57:47.606871 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.606463 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f7f7cbc74-rdpfq" event={"ID":"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba","Type":"ContainerDied","Data":"3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52"} Apr 20 14:57:47.606871 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.606498 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f7f7cbc74-rdpfq" Apr 20 14:57:47.606871 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.606514 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f7f7cbc74-rdpfq" event={"ID":"a2b42fbd-08c4-47f6-9df2-afb02ec9dbba","Type":"ContainerDied","Data":"8bc2f1a21c5c9057f1c92a30b3c2db1a3b8454e8446d6ae7f4e3ae9490b04259"} Apr 20 14:57:47.606871 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.606537 2575 scope.go:117] "RemoveContainer" containerID="3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52" Apr 20 14:57:47.614706 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.614691 2575 scope.go:117] "RemoveContainer" containerID="3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52" Apr 20 14:57:47.614972 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:57:47.614944 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52\": container with ID starting with 3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52 not found: ID does not exist" containerID="3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52" Apr 20 14:57:47.615054 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.614980 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52"} err="failed to get container status \"3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52\": rpc error: code = NotFound desc = could not find container \"3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52\": container with ID starting with 3b8fd6cd97e2c9c24512fb0ffeb8f37e23d2b193282990c4115b73aa7672ef52 not found: ID does not exist" Apr 20 14:57:47.626258 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.626236 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f7f7cbc74-rdpfq"] Apr 20 14:57:47.629910 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:47.629888 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f7f7cbc74-rdpfq"] Apr 20 14:57:48.868307 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:48.868273 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" path="/var/lib/kubelet/pods/a2b42fbd-08c4-47f6-9df2-afb02ec9dbba/volumes" Apr 20 14:57:49.972459 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:49.972430 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fb5cd5f68-msfxx"] Apr 20 14:57:49.972812 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:49.972714 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" containerName="console" Apr 20 14:57:49.972812 ip-10-0-140-93 kubenswrapper[2575]: 
I0420 14:57:49.972724 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" containerName="console" Apr 20 14:57:49.972812 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:49.972810 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2b42fbd-08c4-47f6-9df2-afb02ec9dbba" containerName="console" Apr 20 14:57:49.998323 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:49.998294 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb5cd5f68-msfxx"] Apr 20 14:57:49.998467 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:49.998399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.037765 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.037734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-oauth-serving-cert\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.037889 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.037775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-trusted-ca-bundle\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.037889 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.037792 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-config\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " 
pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.037889 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.037865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-service-ca\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.037993 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.037895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-oauth-config\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.037993 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.037941 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-serving-cert\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.037993 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.037958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxtll\" (UniqueName: \"kubernetes.io/projected/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-kube-api-access-zxtll\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.138800 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.138772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-trusted-ca-bundle\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.138800 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.138803 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-config\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139047 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.138842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-service-ca\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139047 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.138861 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-oauth-config\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139047 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.138911 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-serving-cert\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139047 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.138931 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zxtll\" (UniqueName: \"kubernetes.io/projected/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-kube-api-access-zxtll\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139266 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.139107 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-oauth-serving-cert\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139591 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.139566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-config\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139672 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.139632 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-service-ca\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139757 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.139741 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-trusted-ca-bundle\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.139978 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.139956 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-oauth-serving-cert\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.141510 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.141491 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-oauth-config\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.141589 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.141556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-serving-cert\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.146236 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.146219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxtll\" (UniqueName: \"kubernetes.io/projected/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-kube-api-access-zxtll\") pod \"console-6fb5cd5f68-msfxx\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.307617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.307577 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:57:50.425103 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.425071 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fb5cd5f68-msfxx"] Apr 20 14:57:50.427728 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:57:50.427699 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb13f18be_43ba_486b_9f7d_c6bb7f4b2688.slice/crio-4e02144d43272276c523c2e49e02dc8f9488b10171aa47615cb6239e743691ba WatchSource:0}: Error finding container 4e02144d43272276c523c2e49e02dc8f9488b10171aa47615cb6239e743691ba: Status 404 returned error can't find the container with id 4e02144d43272276c523c2e49e02dc8f9488b10171aa47615cb6239e743691ba Apr 20 14:57:50.618116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.618001 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb5cd5f68-msfxx" event={"ID":"b13f18be-43ba-486b-9f7d-c6bb7f4b2688","Type":"ContainerStarted","Data":"b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9"} Apr 20 14:57:50.618116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:50.618070 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb5cd5f68-msfxx" event={"ID":"b13f18be-43ba-486b-9f7d-c6bb7f4b2688","Type":"ContainerStarted","Data":"4e02144d43272276c523c2e49e02dc8f9488b10171aa47615cb6239e743691ba"} Apr 20 14:57:52.110504 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.110449 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fb5cd5f68-msfxx" podStartSLOduration=3.110429904 podStartE2EDuration="3.110429904s" podCreationTimestamp="2026-04-20 14:57:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:57:50.633285064 +0000 UTC m=+242.325828564" 
watchObservedRunningTime="2026-04-20 14:57:52.110429904 +0000 UTC m=+243.802973409" Apr 20 14:57:52.110970 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.110954 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:57:52.111409 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.111387 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="alertmanager" containerID="cri-o://bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1" gracePeriod=120 Apr 20 14:57:52.111483 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.111461 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy-web" containerID="cri-o://1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895" gracePeriod=120 Apr 20 14:57:52.111543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.111472 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy-metric" containerID="cri-o://061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1" gracePeriod=120 Apr 20 14:57:52.111543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.111481 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="config-reloader" containerID="cri-o://cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70" gracePeriod=120 Apr 20 14:57:52.111543 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.111530 2575 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="prom-label-proxy" containerID="cri-o://326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b" gracePeriod=120 Apr 20 14:57:52.111679 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.111514 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy" containerID="cri-o://c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666" gracePeriod=120 Apr 20 14:57:52.626205 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.626168 2575 generic.go:358] "Generic (PLEG): container finished" podID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerID="326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b" exitCode=0 Apr 20 14:57:52.626205 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.626201 2575 generic.go:358] "Generic (PLEG): container finished" podID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerID="c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666" exitCode=0 Apr 20 14:57:52.626205 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.626210 2575 generic.go:358] "Generic (PLEG): container finished" podID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerID="cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70" exitCode=0 Apr 20 14:57:52.626434 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.626216 2575 generic.go:358] "Generic (PLEG): container finished" podID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerID="bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1" exitCode=0 Apr 20 14:57:52.626434 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.626233 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerDied","Data":"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b"} Apr 20 14:57:52.626434 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.626271 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerDied","Data":"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666"} Apr 20 14:57:52.626434 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.626282 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerDied","Data":"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70"} Apr 20 14:57:52.626434 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:52.626292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerDied","Data":"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1"} Apr 20 14:57:53.361117 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.361086 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.466553 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466468 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-main-tls\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466553 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466502 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-metrics-client-ca\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466553 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466533 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466553 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466550 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7b2lk\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-kube-api-access-7b2lk\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466832 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466570 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466832 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466592 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-main-db\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466832 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466706 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-out\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466832 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466749 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-volume\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466832 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466782 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-web-config\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.466832 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466809 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.467171 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466836 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.467171 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466877 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-cluster-tls-config\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.467171 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466902 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-tls-assets\") pod \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\" (UID: \"d88c1951-f0da-4059-a1c4-e1ca624bee9b\") " Apr 20 14:57:53.467171 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.466933 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:53.467371 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.467199 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-metrics-client-ca\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.467371 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.467248 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:57:53.467612 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.467585 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:53.470013 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.469890 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:53.470013 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.469979 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:53.470211 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.470065 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-out" (OuterVolumeSpecName: "config-out") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:57:53.470211 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.470183 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:53.470411 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.470379 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-kube-api-access-7b2lk" (OuterVolumeSpecName: "kube-api-access-7b2lk") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "kube-api-access-7b2lk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:53.470489 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.470462 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-volume" (OuterVolumeSpecName: "config-volume") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:53.470609 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.470591 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:53.471463 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.471447 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:53.475274 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.475255 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:53.481493 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.481470 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-web-config" (OuterVolumeSpecName: "web-config") pod "d88c1951-f0da-4059-a1c4-e1ca624bee9b" (UID: "d88c1951-f0da-4059-a1c4-e1ca624bee9b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:53.568015 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.567978 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568015 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568012 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7b2lk\" (UniqueName: \"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-kube-api-access-7b2lk\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568015 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568040 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568015 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568050 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-main-db\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568060 2575 reconciler_common.go:299] "Volume detached for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-out\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568069 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-config-volume\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568078 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-web-config\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568085 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c1951-f0da-4059-a1c4-e1ca624bee9b-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568094 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568103 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-cluster-tls-config\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568111 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/d88c1951-f0da-4059-a1c4-e1ca624bee9b-tls-assets\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.568249 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.568120 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d88c1951-f0da-4059-a1c4-e1ca624bee9b-secret-alertmanager-main-tls\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:57:53.631627 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.631602 2575 generic.go:358] "Generic (PLEG): container finished" podID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerID="061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1" exitCode=0 Apr 20 14:57:53.631627 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.631625 2575 generic.go:358] "Generic (PLEG): container finished" podID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerID="1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895" exitCode=0 Apr 20 14:57:53.631783 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.631689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerDied","Data":"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1"} Apr 20 14:57:53.631783 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.631728 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerDied","Data":"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895"} Apr 20 14:57:53.631783 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.631739 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"d88c1951-f0da-4059-a1c4-e1ca624bee9b","Type":"ContainerDied","Data":"d1e20ba3eb622a6fedf9296e3c8c0cb961b9d4690869acbefcd41bff7cdab5f7"} Apr 20 14:57:53.631783 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.631754 2575 scope.go:117] "RemoveContainer" containerID="326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b" Apr 20 14:57:53.631783 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.631697 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.639590 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.639571 2575 scope.go:117] "RemoveContainer" containerID="061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1" Apr 20 14:57:53.647038 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.646995 2575 scope.go:117] "RemoveContainer" containerID="c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666" Apr 20 14:57:53.653844 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.653815 2575 scope.go:117] "RemoveContainer" containerID="1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895" Apr 20 14:57:53.654882 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.654797 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:57:53.660783 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.660761 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:57:53.661268 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.661251 2575 scope.go:117] "RemoveContainer" containerID="cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70" Apr 20 14:57:53.667862 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.667846 2575 scope.go:117] "RemoveContainer" containerID="bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1" Apr 20 14:57:53.675572 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.675553 2575 
scope.go:117] "RemoveContainer" containerID="29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0" Apr 20 14:57:53.682075 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682057 2575 scope.go:117] "RemoveContainer" containerID="326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b" Apr 20 14:57:53.682307 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:57:53.682290 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b\": container with ID starting with 326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b not found: ID does not exist" containerID="326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b" Apr 20 14:57:53.682363 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682316 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b"} err="failed to get container status \"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b\": rpc error: code = NotFound desc = could not find container \"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b\": container with ID starting with 326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b not found: ID does not exist" Apr 20 14:57:53.682363 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682335 2575 scope.go:117] "RemoveContainer" containerID="061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1" Apr 20 14:57:53.682537 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:57:53.682520 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1\": container with ID starting with 061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1 not found: ID does not 
exist" containerID="061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1" Apr 20 14:57:53.682581 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682547 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1"} err="failed to get container status \"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1\": rpc error: code = NotFound desc = could not find container \"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1\": container with ID starting with 061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1 not found: ID does not exist" Apr 20 14:57:53.682581 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682563 2575 scope.go:117] "RemoveContainer" containerID="c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666" Apr 20 14:57:53.682736 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:57:53.682720 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666\": container with ID starting with c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666 not found: ID does not exist" containerID="c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666" Apr 20 14:57:53.682781 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682738 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666"} err="failed to get container status \"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666\": rpc error: code = NotFound desc = could not find container \"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666\": container with ID starting with c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666 not found: ID does not exist" Apr 20 
14:57:53.682781 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682750 2575 scope.go:117] "RemoveContainer" containerID="1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895" Apr 20 14:57:53.682919 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:57:53.682903 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895\": container with ID starting with 1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895 not found: ID does not exist" containerID="1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895" Apr 20 14:57:53.682956 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682923 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895"} err="failed to get container status \"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895\": rpc error: code = NotFound desc = could not find container \"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895\": container with ID starting with 1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895 not found: ID does not exist" Apr 20 14:57:53.682956 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.682936 2575 scope.go:117] "RemoveContainer" containerID="cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70" Apr 20 14:57:53.683189 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:57:53.683173 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70\": container with ID starting with cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70 not found: ID does not exist" containerID="cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70" Apr 20 14:57:53.683254 
ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683193 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70"} err="failed to get container status \"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70\": rpc error: code = NotFound desc = could not find container \"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70\": container with ID starting with cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70 not found: ID does not exist" Apr 20 14:57:53.683254 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683205 2575 scope.go:117] "RemoveContainer" containerID="bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1" Apr 20 14:57:53.683387 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:57:53.683371 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1\": container with ID starting with bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1 not found: ID does not exist" containerID="bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1" Apr 20 14:57:53.683426 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683392 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1"} err="failed to get container status \"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1\": rpc error: code = NotFound desc = could not find container \"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1\": container with ID starting with bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1 not found: ID does not exist" Apr 20 14:57:53.683426 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683407 2575 scope.go:117] "RemoveContainer" 
containerID="29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0" Apr 20 14:57:53.683629 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:57:53.683614 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0\": container with ID starting with 29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0 not found: ID does not exist" containerID="29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0" Apr 20 14:57:53.683678 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683631 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0"} err="failed to get container status \"29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0\": rpc error: code = NotFound desc = could not find container \"29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0\": container with ID starting with 29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0 not found: ID does not exist" Apr 20 14:57:53.683678 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683643 2575 scope.go:117] "RemoveContainer" containerID="326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b" Apr 20 14:57:53.683816 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683796 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b"} err="failed to get container status \"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b\": rpc error: code = NotFound desc = could not find container \"326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b\": container with ID starting with 326f28f54092803e8c664d00f3e2b83de89f8b7af6e9fd9b6f7f0d8c88ed039b not found: ID does not exist" Apr 20 
14:57:53.683853 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683817 2575 scope.go:117] "RemoveContainer" containerID="061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1" Apr 20 14:57:53.683968 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683942 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1"} err="failed to get container status \"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1\": rpc error: code = NotFound desc = could not find container \"061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1\": container with ID starting with 061dac21422af691ea46283f3fa3e0c52a42295d30379d7f453040e01f84acc1 not found: ID does not exist" Apr 20 14:57:53.684054 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.683971 2575 scope.go:117] "RemoveContainer" containerID="c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666" Apr 20 14:57:53.684187 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684169 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666"} err="failed to get container status \"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666\": rpc error: code = NotFound desc = could not find container \"c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666\": container with ID starting with c92c16d9d3ba682458553d4c793d90211c7210f9f5195571350fc6cae7c80666 not found: ID does not exist" Apr 20 14:57:53.684254 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684189 2575 scope.go:117] "RemoveContainer" containerID="1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895" Apr 20 14:57:53.684394 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684378 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895"} err="failed to get container status \"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895\": rpc error: code = NotFound desc = could not find container \"1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895\": container with ID starting with 1f5ec8d3337f2a67431a8c33016d9e181ed13a4fb434cfaac6e1145f51d13895 not found: ID does not exist" Apr 20 14:57:53.684438 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684396 2575 scope.go:117] "RemoveContainer" containerID="cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70" Apr 20 14:57:53.684590 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684573 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70"} err="failed to get container status \"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70\": rpc error: code = NotFound desc = could not find container \"cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70\": container with ID starting with cc18bbf035dbcd543200002380d01276e92a45c7855a2a92439b5b8ad29b4e70 not found: ID does not exist" Apr 20 14:57:53.684655 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684592 2575 scope.go:117] "RemoveContainer" containerID="bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1" Apr 20 14:57:53.684836 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684818 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1"} err="failed to get container status \"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1\": rpc error: code = NotFound desc = could not find container \"bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1\": container with ID starting with 
bdbd0922466bd26caff6fe8f40fc9f1771b17b6caa4691f8ca529e22b20518b1 not found: ID does not exist" Apr 20 14:57:53.684893 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684837 2575 scope.go:117] "RemoveContainer" containerID="29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0" Apr 20 14:57:53.685011 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.684996 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0"} err="failed to get container status \"29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0\": rpc error: code = NotFound desc = could not find container \"29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0\": container with ID starting with 29098a48bc1ffba0b22fd7f87455a4ffd8c6d455113a5db71d774ab2c0df60d0 not found: ID does not exist" Apr 20 14:57:53.687905 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.687885 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:57:53.688262 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688246 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="prom-label-proxy" Apr 20 14:57:53.688309 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688266 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="prom-label-proxy" Apr 20 14:57:53.688309 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688278 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="init-config-reloader" Apr 20 14:57:53.688309 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688284 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="init-config-reloader" Apr 20 
14:57:53.688309 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688292 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy" Apr 20 14:57:53.688309 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688297 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy" Apr 20 14:57:53.688309 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688305 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy-metric" Apr 20 14:57:53.688309 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688310 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy-metric" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688320 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="alertmanager" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688325 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="alertmanager" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688335 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="config-reloader" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688340 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="config-reloader" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688345 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" 
containerName="kube-rbac-proxy-web" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688350 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy-web" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688398 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="config-reloader" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688406 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy-web" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688412 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688419 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="alertmanager" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688424 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="prom-label-proxy" Apr 20 14:57:53.688514 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.688431 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" containerName="kube-rbac-proxy-metric" Apr 20 14:57:53.693352 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.693338 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.695927 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.695906 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 14:57:53.696047 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.695906 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 14:57:53.696047 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.695924 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 14:57:53.696047 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.695974 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 14:57:53.696294 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.696277 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 14:57:53.696363 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.696320 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 14:57:53.696363 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.696310 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 14:57:53.696467 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.696367 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 14:57:53.696467 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.696410 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-6v6dq\"" Apr 20 14:57:53.701860 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.701842 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 14:57:53.705786 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.705768 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:57:53.769510 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769440 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22424d77-53b6-4c1b-9c41-b2f13a205d99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769510 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769474 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769651 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22424d77-53b6-4c1b-9c41-b2f13a205d99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769651 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769583 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769651 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/22424d77-53b6-4c1b-9c41-b2f13a205d99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769651 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw6hz\" (UniqueName: \"kubernetes.io/projected/22424d77-53b6-4c1b-9c41-b2f13a205d99-kube-api-access-xw6hz\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769651 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769740 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-config-volume\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769767 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22424d77-53b6-4c1b-9c41-b2f13a205d99-config-out\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769790 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-web-config\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.769859 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.769843 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/22424d77-53b6-4c1b-9c41-b2f13a205d99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870541 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870541 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/22424d77-53b6-4c1b-9c41-b2f13a205d99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xw6hz\" (UniqueName: \"kubernetes.io/projected/22424d77-53b6-4c1b-9c41-b2f13a205d99-kube-api-access-xw6hz\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870749 ip-10-0-140-93 kubenswrapper[2575]: I0420 
14:57:53.870626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-config-volume\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870682 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22424d77-53b6-4c1b-9c41-b2f13a205d99-config-out\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-web-config\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.870749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 
14:57:53.871114 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22424d77-53b6-4c1b-9c41-b2f13a205d99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.871114 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870812 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22424d77-53b6-4c1b-9c41-b2f13a205d99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.871114 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870833 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.871114 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870880 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22424d77-53b6-4c1b-9c41-b2f13a205d99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.871114 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.870938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/22424d77-53b6-4c1b-9c41-b2f13a205d99-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.872082 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.872056 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/22424d77-53b6-4c1b-9c41-b2f13a205d99-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.872380 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.872353 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/22424d77-53b6-4c1b-9c41-b2f13a205d99-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.874345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.874194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22424d77-53b6-4c1b-9c41-b2f13a205d99-tls-assets\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.874345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.874286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-web-config\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.874345 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.874325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: 
\"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.874556 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.874415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.874556 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.874454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.874556 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.874453 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.874709 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.874678 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.874709 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.874685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/22424d77-53b6-4c1b-9c41-b2f13a205d99-config-out\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.875642 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.875625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/22424d77-53b6-4c1b-9c41-b2f13a205d99-config-volume\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:53.878188 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:53.878165 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw6hz\" (UniqueName: \"kubernetes.io/projected/22424d77-53b6-4c1b-9c41-b2f13a205d99-kube-api-access-xw6hz\") pod \"alertmanager-main-0\" (UID: \"22424d77-53b6-4c1b-9c41-b2f13a205d99\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:54.002440 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:54.002412 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:57:54.127726 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:54.127702 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:57:54.130002 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:57:54.129975 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22424d77_53b6_4c1b_9c41_b2f13a205d99.slice/crio-a96368abd47be4421b136642b7e1d7b0bd5d0a422a3418147b67712c87e46351 WatchSource:0}: Error finding container a96368abd47be4421b136642b7e1d7b0bd5d0a422a3418147b67712c87e46351: Status 404 returned error can't find the container with id a96368abd47be4421b136642b7e1d7b0bd5d0a422a3418147b67712c87e46351 Apr 20 14:57:54.635529 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:54.635501 2575 generic.go:358] "Generic (PLEG): container finished" podID="22424d77-53b6-4c1b-9c41-b2f13a205d99" containerID="8fa1944dbc16119bca127a924a10d78a08db71f6d987371a31dcb573912f2348" exitCode=0 Apr 20 14:57:54.635941 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:54.635590 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22424d77-53b6-4c1b-9c41-b2f13a205d99","Type":"ContainerDied","Data":"8fa1944dbc16119bca127a924a10d78a08db71f6d987371a31dcb573912f2348"} Apr 20 14:57:54.635941 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:54.635623 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22424d77-53b6-4c1b-9c41-b2f13a205d99","Type":"ContainerStarted","Data":"a96368abd47be4421b136642b7e1d7b0bd5d0a422a3418147b67712c87e46351"} Apr 20 14:57:54.868350 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:54.868325 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d88c1951-f0da-4059-a1c4-e1ca624bee9b" 
path="/var/lib/kubelet/pods/d88c1951-f0da-4059-a1c4-e1ca624bee9b/volumes" Apr 20 14:57:55.642139 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:55.642104 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22424d77-53b6-4c1b-9c41-b2f13a205d99","Type":"ContainerStarted","Data":"02c50752107fd670fcec2f7454c73d1fc8779edf4252f2ea4e55418e10bd5264"} Apr 20 14:57:55.642139 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:55.642143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22424d77-53b6-4c1b-9c41-b2f13a205d99","Type":"ContainerStarted","Data":"b534bd0254720a72884810280347337d0bb37399562acbc68cef8979f48ccafa"} Apr 20 14:57:55.642528 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:55.642158 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22424d77-53b6-4c1b-9c41-b2f13a205d99","Type":"ContainerStarted","Data":"08ca3b105c4b3e45fc0237eed766d486f1d57d6ffef0bf75c0cf7bb374778911"} Apr 20 14:57:55.642528 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:55.642171 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22424d77-53b6-4c1b-9c41-b2f13a205d99","Type":"ContainerStarted","Data":"9800a3441f509508dc5dde1fd5ff66c02aa17a0b56b8d195b8da690eddd8dc33"} Apr 20 14:57:55.642528 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:55.642183 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"22424d77-53b6-4c1b-9c41-b2f13a205d99","Type":"ContainerStarted","Data":"6dedfb870aff0a527144e1dd64882c04e819b417639fa3e88956bcdae6512777"} Apr 20 14:57:55.642528 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:55.642197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"22424d77-53b6-4c1b-9c41-b2f13a205d99","Type":"ContainerStarted","Data":"75ab2d90cfceacff2370f122cf3bab8370abd1642909bca52e1558d21fc37e62"} Apr 20 14:57:55.668044 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:57:55.667979 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.667965793 podStartE2EDuration="2.667965793s" podCreationTimestamp="2026-04-20 14:57:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:57:55.666317349 +0000 UTC m=+247.358860848" watchObservedRunningTime="2026-04-20 14:57:55.667965793 +0000 UTC m=+247.360509291" Apr 20 14:58:00.307713 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.307672 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:58:00.308093 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.307741 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:58:00.312515 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.312492 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:58:00.662606 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.662535 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 14:58:00.704666 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.704638 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699684764b-wwtbc"] Apr 20 14:58:00.727660 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.727621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:58:00.730190 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.730164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2439f5-03aa-43b8-9466-c01fbcb53912-metrics-certs\") pod \"network-metrics-daemon-gpcl9\" (UID: \"ae2439f5-03aa-43b8-9466-c01fbcb53912\") " pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:58:00.968381 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.968303 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ns2f2\"" Apr 20 14:58:00.976256 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:00.976238 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gpcl9" Apr 20 14:58:01.090107 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:01.090084 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gpcl9"] Apr 20 14:58:01.092666 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:58:01.092639 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2439f5_03aa_43b8_9466_c01fbcb53912.slice/crio-342d90456317e64ff3fba8b573cb47b65564003db1dfa5146c2271c5392c9278 WatchSource:0}: Error finding container 342d90456317e64ff3fba8b573cb47b65564003db1dfa5146c2271c5392c9278: Status 404 returned error can't find the container with id 342d90456317e64ff3fba8b573cb47b65564003db1dfa5146c2271c5392c9278 Apr 20 14:58:01.662988 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:01.662898 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gpcl9" 
event={"ID":"ae2439f5-03aa-43b8-9466-c01fbcb53912","Type":"ContainerStarted","Data":"342d90456317e64ff3fba8b573cb47b65564003db1dfa5146c2271c5392c9278"} Apr 20 14:58:02.669900 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:02.669865 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gpcl9" event={"ID":"ae2439f5-03aa-43b8-9466-c01fbcb53912","Type":"ContainerStarted","Data":"0b442569265d06b3c9083b845067c7c602dd699d078688eaf0ba598ddeb034f9"} Apr 20 14:58:02.669900 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:02.669904 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gpcl9" event={"ID":"ae2439f5-03aa-43b8-9466-c01fbcb53912","Type":"ContainerStarted","Data":"e912a93b8a6312b10866598f6e0bd8e078192c68223590516420168703aedfb7"} Apr 20 14:58:02.685183 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:02.685138 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gpcl9" podStartSLOduration=252.736449606 podStartE2EDuration="4m13.685123972s" podCreationTimestamp="2026-04-20 14:53:49 +0000 UTC" firstStartedPulling="2026-04-20 14:58:01.094305551 +0000 UTC m=+252.786849031" lastFinishedPulling="2026-04-20 14:58:02.042979919 +0000 UTC m=+253.735523397" observedRunningTime="2026-04-20 14:58:02.683964136 +0000 UTC m=+254.376507631" watchObservedRunningTime="2026-04-20 14:58:02.685123972 +0000 UTC m=+254.377667472" Apr 20 14:58:19.143666 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.143626 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-r72kh"] Apr 20 14:58:19.147949 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.147925 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.150444 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.150422 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 14:58:19.153608 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.153583 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r72kh"] Apr 20 14:58:19.281498 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.281463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/419c2e41-ed48-42a7-81ae-10358a918874-dbus\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.281498 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.281500 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419c2e41-ed48-42a7-81ae-10358a918874-original-pull-secret\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.281707 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.281552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/419c2e41-ed48-42a7-81ae-10358a918874-kubelet-config\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.382838 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.382808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/419c2e41-ed48-42a7-81ae-10358a918874-dbus\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.382838 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.382840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419c2e41-ed48-42a7-81ae-10358a918874-original-pull-secret\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.383062 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.382896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/419c2e41-ed48-42a7-81ae-10358a918874-kubelet-config\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.383062 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.382962 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/419c2e41-ed48-42a7-81ae-10358a918874-kubelet-config\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.383062 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.383000 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/419c2e41-ed48-42a7-81ae-10358a918874-dbus\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.385235 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.385217 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/419c2e41-ed48-42a7-81ae-10358a918874-original-pull-secret\") pod \"global-pull-secret-syncer-r72kh\" (UID: \"419c2e41-ed48-42a7-81ae-10358a918874\") " pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.457448 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.457364 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-r72kh" Apr 20 14:58:19.574792 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.574766 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-r72kh"] Apr 20 14:58:19.577098 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:58:19.577071 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod419c2e41_ed48_42a7_81ae_10358a918874.slice/crio-d6f6fd8fc1a1a36e51277b6da33f27e5f32b13b4d67cd1017ec1deeb767a8644 WatchSource:0}: Error finding container d6f6fd8fc1a1a36e51277b6da33f27e5f32b13b4d67cd1017ec1deeb767a8644: Status 404 returned error can't find the container with id d6f6fd8fc1a1a36e51277b6da33f27e5f32b13b4d67cd1017ec1deeb767a8644 Apr 20 14:58:19.718802 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:19.718717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r72kh" event={"ID":"419c2e41-ed48-42a7-81ae-10358a918874","Type":"ContainerStarted","Data":"d6f6fd8fc1a1a36e51277b6da33f27e5f32b13b4d67cd1017ec1deeb767a8644"} Apr 20 14:58:24.735259 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:24.735217 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-r72kh" event={"ID":"419c2e41-ed48-42a7-81ae-10358a918874","Type":"ContainerStarted","Data":"019601bf836948753b34d23196e3ebfe77abea2832ff0de00ab025449907541f"} Apr 20 14:58:24.753102 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:24.753049 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-r72kh" podStartSLOduration=1.694131713 podStartE2EDuration="5.753015535s" podCreationTimestamp="2026-04-20 14:58:19 +0000 UTC" firstStartedPulling="2026-04-20 14:58:19.5790538 +0000 UTC m=+271.271597277" lastFinishedPulling="2026-04-20 14:58:23.637937622 +0000 UTC m=+275.330481099" observedRunningTime="2026-04-20 14:58:24.752729812 +0000 UTC m=+276.445273312" watchObservedRunningTime="2026-04-20 14:58:24.753015535 +0000 UTC m=+276.445559041" Apr 20 14:58:25.726292 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:25.726247 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-699684764b-wwtbc" podUID="bee525db-20a4-49db-8d07-54d179347cf5" containerName="console" containerID="cri-o://868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1" gracePeriod=15 Apr 20 14:58:25.961067 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:25.961044 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699684764b-wwtbc_bee525db-20a4-49db-8d07-54d179347cf5/console/0.log" Apr 20 14:58:25.961350 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:25.961107 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:58:26.041641 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.041614 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-console-config\") pod \"bee525db-20a4-49db-8d07-54d179347cf5\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " Apr 20 14:58:26.041793 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.041658 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-oauth-serving-cert\") pod \"bee525db-20a4-49db-8d07-54d179347cf5\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " Apr 20 14:58:26.041793 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.041686 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbq4d\" (UniqueName: \"kubernetes.io/projected/bee525db-20a4-49db-8d07-54d179347cf5-kube-api-access-qbq4d\") pod \"bee525db-20a4-49db-8d07-54d179347cf5\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " Apr 20 14:58:26.041793 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.041702 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-service-ca\") pod \"bee525db-20a4-49db-8d07-54d179347cf5\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " Apr 20 14:58:26.041793 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.041742 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-oauth-config\") pod \"bee525db-20a4-49db-8d07-54d179347cf5\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " Apr 20 14:58:26.041793 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:58:26.041761 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-serving-cert\") pod \"bee525db-20a4-49db-8d07-54d179347cf5\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " Apr 20 14:58:26.042011 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.041813 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-trusted-ca-bundle\") pod \"bee525db-20a4-49db-8d07-54d179347cf5\" (UID: \"bee525db-20a4-49db-8d07-54d179347cf5\") " Apr 20 14:58:26.042100 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.042033 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-console-config" (OuterVolumeSpecName: "console-config") pod "bee525db-20a4-49db-8d07-54d179347cf5" (UID: "bee525db-20a4-49db-8d07-54d179347cf5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:26.042157 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.042091 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bee525db-20a4-49db-8d07-54d179347cf5" (UID: "bee525db-20a4-49db-8d07-54d179347cf5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:26.042207 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.042166 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-service-ca" (OuterVolumeSpecName: "service-ca") pod "bee525db-20a4-49db-8d07-54d179347cf5" (UID: "bee525db-20a4-49db-8d07-54d179347cf5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:26.042403 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.042371 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bee525db-20a4-49db-8d07-54d179347cf5" (UID: "bee525db-20a4-49db-8d07-54d179347cf5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:26.044068 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.044046 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bee525db-20a4-49db-8d07-54d179347cf5" (UID: "bee525db-20a4-49db-8d07-54d179347cf5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:58:26.044161 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.044143 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee525db-20a4-49db-8d07-54d179347cf5-kube-api-access-qbq4d" (OuterVolumeSpecName: "kube-api-access-qbq4d") pod "bee525db-20a4-49db-8d07-54d179347cf5" (UID: "bee525db-20a4-49db-8d07-54d179347cf5"). InnerVolumeSpecName "kube-api-access-qbq4d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:58:26.044295 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.044269 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bee525db-20a4-49db-8d07-54d179347cf5" (UID: "bee525db-20a4-49db-8d07-54d179347cf5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:58:26.142752 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.142718 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qbq4d\" (UniqueName: \"kubernetes.io/projected/bee525db-20a4-49db-8d07-54d179347cf5-kube-api-access-qbq4d\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:58:26.142752 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.142748 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-service-ca\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:58:26.142752 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.142758 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-oauth-config\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:58:26.142957 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.142767 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bee525db-20a4-49db-8d07-54d179347cf5-console-serving-cert\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:58:26.142957 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.142776 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-trusted-ca-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:58:26.142957 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.142785 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-console-config\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:58:26.142957 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.142793 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bee525db-20a4-49db-8d07-54d179347cf5-oauth-serving-cert\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:58:26.742737 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.742708 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699684764b-wwtbc_bee525db-20a4-49db-8d07-54d179347cf5/console/0.log" Apr 20 14:58:26.742935 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.742749 2575 generic.go:358] "Generic (PLEG): container finished" podID="bee525db-20a4-49db-8d07-54d179347cf5" containerID="868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1" exitCode=2 Apr 20 14:58:26.742935 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.742787 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699684764b-wwtbc" event={"ID":"bee525db-20a4-49db-8d07-54d179347cf5","Type":"ContainerDied","Data":"868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1"} Apr 20 14:58:26.742935 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.742833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699684764b-wwtbc" event={"ID":"bee525db-20a4-49db-8d07-54d179347cf5","Type":"ContainerDied","Data":"e534124a44d78a57381e8debfcbdc668a1ccc2f40c2302207b4d6dc307cc38d8"} Apr 20 14:58:26.742935 ip-10-0-140-93 
kubenswrapper[2575]: I0420 14:58:26.742846 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-699684764b-wwtbc" Apr 20 14:58:26.742935 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.742854 2575 scope.go:117] "RemoveContainer" containerID="868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1" Apr 20 14:58:26.751383 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.751367 2575 scope.go:117] "RemoveContainer" containerID="868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1" Apr 20 14:58:26.751608 ip-10-0-140-93 kubenswrapper[2575]: E0420 14:58:26.751588 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1\": container with ID starting with 868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1 not found: ID does not exist" containerID="868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1" Apr 20 14:58:26.751671 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.751617 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1"} err="failed to get container status \"868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1\": rpc error: code = NotFound desc = could not find container \"868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1\": container with ID starting with 868b477139725d98b3a0c34b5a78eaf029af115d96579875eca546441ff868b1 not found: ID does not exist" Apr 20 14:58:26.763676 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.763646 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699684764b-wwtbc"] Apr 20 14:58:26.766823 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.766805 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-699684764b-wwtbc"] Apr 20 14:58:26.868401 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:26.868370 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee525db-20a4-49db-8d07-54d179347cf5" path="/var/lib/kubelet/pods/bee525db-20a4-49db-8d07-54d179347cf5/volumes" Apr 20 14:58:48.755638 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:48.755604 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log" Apr 20 14:58:48.756649 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:48.756460 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log" Apr 20 14:58:48.759229 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:48.759210 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 14:58:52.654886 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.654851 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w"] Apr 20 14:58:52.657258 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.655235 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bee525db-20a4-49db-8d07-54d179347cf5" containerName="console" Apr 20 14:58:52.657258 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.655248 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee525db-20a4-49db-8d07-54d179347cf5" containerName="console" Apr 20 14:58:52.657258 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.655308 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bee525db-20a4-49db-8d07-54d179347cf5" containerName="console" Apr 20 14:58:52.658200 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.658185 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.660672 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.660640 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 14:58:52.660672 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.660660 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 14:58:52.661568 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.661555 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ckg7g\"" Apr 20 14:58:52.665266 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.665188 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w"] Apr 20 14:58:52.760070 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.760038 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.760236 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.760130 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkhfg\" (UniqueName: \"kubernetes.io/projected/4f47e90f-1642-47f3-ac0d-058356bc03be-kube-api-access-zkhfg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" 
Apr 20 14:58:52.760236 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.760157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.861146 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.861115 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkhfg\" (UniqueName: \"kubernetes.io/projected/4f47e90f-1642-47f3-ac0d-058356bc03be-kube-api-access-zkhfg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.861146 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.861146 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.861325 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.861193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.861594 ip-10-0-140-93 kubenswrapper[2575]: 
I0420 14:58:52.861577 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.862158 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.862143 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.868998 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.868973 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkhfg\" (UniqueName: \"kubernetes.io/projected/4f47e90f-1642-47f3-ac0d-058356bc03be-kube-api-access-zkhfg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" Apr 20 14:58:52.968313 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:52.968237 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w"
Apr 20 14:58:53.087652 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:53.087630 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w"]
Apr 20 14:58:53.090185 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:58:53.090148 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f47e90f_1642_47f3_ac0d_058356bc03be.slice/crio-fbadf1f7db38a6d28d80198a1af40fd435f8aee8ae00a526a62e3207aa915d0c WatchSource:0}: Error finding container fbadf1f7db38a6d28d80198a1af40fd435f8aee8ae00a526a62e3207aa915d0c: Status 404 returned error can't find the container with id fbadf1f7db38a6d28d80198a1af40fd435f8aee8ae00a526a62e3207aa915d0c
Apr 20 14:58:53.092063 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:53.092041 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 14:58:53.819546 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:53.819508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" event={"ID":"4f47e90f-1642-47f3-ac0d-058356bc03be","Type":"ContainerStarted","Data":"fbadf1f7db38a6d28d80198a1af40fd435f8aee8ae00a526a62e3207aa915d0c"}
Apr 20 14:58:59.839850 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:59.839809 2575 generic.go:358] "Generic (PLEG): container finished" podID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerID="3ec28270c45e56bdc35df1db33163050aada476ba8ed58fab486ba3b0e1bba3f" exitCode=0
Apr 20 14:58:59.840283 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:58:59.839862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" event={"ID":"4f47e90f-1642-47f3-ac0d-058356bc03be","Type":"ContainerDied","Data":"3ec28270c45e56bdc35df1db33163050aada476ba8ed58fab486ba3b0e1bba3f"}
Apr 20 14:59:02.850931 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:02.850896 2575 generic.go:358] "Generic (PLEG): container finished" podID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerID="81ff9082afb7152f25ea8f47d8eb1bd261544ea5a773a7b7f4d3351c863a3aaf" exitCode=0
Apr 20 14:59:02.851340 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:02.850983 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" event={"ID":"4f47e90f-1642-47f3-ac0d-058356bc03be","Type":"ContainerDied","Data":"81ff9082afb7152f25ea8f47d8eb1bd261544ea5a773a7b7f4d3351c863a3aaf"}
Apr 20 14:59:09.874958 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:09.874912 2575 generic.go:358] "Generic (PLEG): container finished" podID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerID="0ba20049765d993f5d83bfe236f27f411ec714157dd69a2da9f65459d429f640" exitCode=0
Apr 20 14:59:09.875330 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:09.874965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" event={"ID":"4f47e90f-1642-47f3-ac0d-058356bc03be","Type":"ContainerDied","Data":"0ba20049765d993f5d83bfe236f27f411ec714157dd69a2da9f65459d429f640"}
Apr 20 14:59:11.001741 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.001718 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w"
Apr 20 14:59:11.114587 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.114546 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-util\") pod \"4f47e90f-1642-47f3-ac0d-058356bc03be\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") "
Apr 20 14:59:11.114749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.114629 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkhfg\" (UniqueName: \"kubernetes.io/projected/4f47e90f-1642-47f3-ac0d-058356bc03be-kube-api-access-zkhfg\") pod \"4f47e90f-1642-47f3-ac0d-058356bc03be\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") "
Apr 20 14:59:11.114749 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.114681 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-bundle\") pod \"4f47e90f-1642-47f3-ac0d-058356bc03be\" (UID: \"4f47e90f-1642-47f3-ac0d-058356bc03be\") "
Apr 20 14:59:11.115335 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.115309 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-bundle" (OuterVolumeSpecName: "bundle") pod "4f47e90f-1642-47f3-ac0d-058356bc03be" (UID: "4f47e90f-1642-47f3-ac0d-058356bc03be"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:59:11.116987 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.116965 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f47e90f-1642-47f3-ac0d-058356bc03be-kube-api-access-zkhfg" (OuterVolumeSpecName: "kube-api-access-zkhfg") pod "4f47e90f-1642-47f3-ac0d-058356bc03be" (UID: "4f47e90f-1642-47f3-ac0d-058356bc03be"). InnerVolumeSpecName "kube-api-access-zkhfg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:59:11.118433 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.118407 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-util" (OuterVolumeSpecName: "util") pod "4f47e90f-1642-47f3-ac0d-058356bc03be" (UID: "4f47e90f-1642-47f3-ac0d-058356bc03be"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:59:11.215281 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.215199 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:11.215281 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.215229 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkhfg\" (UniqueName: \"kubernetes.io/projected/4f47e90f-1642-47f3-ac0d-058356bc03be-kube-api-access-zkhfg\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:11.215281 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.215240 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f47e90f-1642-47f3-ac0d-058356bc03be-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:11.882359 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.882330 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w"
Apr 20 14:59:11.882359 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.882345 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xf67w" event={"ID":"4f47e90f-1642-47f3-ac0d-058356bc03be","Type":"ContainerDied","Data":"fbadf1f7db38a6d28d80198a1af40fd435f8aee8ae00a526a62e3207aa915d0c"}
Apr 20 14:59:11.882565 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:11.882373 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbadf1f7db38a6d28d80198a1af40fd435f8aee8ae00a526a62e3207aa915d0c"
Apr 20 14:59:15.151100 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.151061 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"]
Apr 20 14:59:15.151454 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.151381 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerName="pull"
Apr 20 14:59:15.151454 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.151393 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerName="pull"
Apr 20 14:59:15.151454 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.151406 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerName="extract"
Apr 20 14:59:15.151454 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.151412 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerName="extract"
Apr 20 14:59:15.151454 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.151426 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerName="util"
Apr 20 14:59:15.151454 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.151432 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerName="util"
Apr 20 14:59:15.151638 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.151480 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f47e90f-1642-47f3-ac0d-058356bc03be" containerName="extract"
Apr 20 14:59:15.155728 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.155712 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"
Apr 20 14:59:15.158317 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.158289 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 20 14:59:15.158317 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.158289 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 20 14:59:15.158494 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.158329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-vl2s5\""
Apr 20 14:59:15.166307 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.166287 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"]
Apr 20 14:59:15.250856 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.250818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09590223-0dd5-4270-84e0-9c76d6d7ac46-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-b4jjs\" (UID: \"09590223-0dd5-4270-84e0-9c76d6d7ac46\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"
Apr 20 14:59:15.251001 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.250866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6ghw\" (UniqueName: \"kubernetes.io/projected/09590223-0dd5-4270-84e0-9c76d6d7ac46-kube-api-access-t6ghw\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-b4jjs\" (UID: \"09590223-0dd5-4270-84e0-9c76d6d7ac46\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"
Apr 20 14:59:15.351966 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.351933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6ghw\" (UniqueName: \"kubernetes.io/projected/09590223-0dd5-4270-84e0-9c76d6d7ac46-kube-api-access-t6ghw\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-b4jjs\" (UID: \"09590223-0dd5-4270-84e0-9c76d6d7ac46\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"
Apr 20 14:59:15.352136 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.352036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09590223-0dd5-4270-84e0-9c76d6d7ac46-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-b4jjs\" (UID: \"09590223-0dd5-4270-84e0-9c76d6d7ac46\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"
Apr 20 14:59:15.352386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.352370 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/09590223-0dd5-4270-84e0-9c76d6d7ac46-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-b4jjs\" (UID: \"09590223-0dd5-4270-84e0-9c76d6d7ac46\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"
Apr 20 14:59:15.360386 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.360350 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6ghw\" (UniqueName: \"kubernetes.io/projected/09590223-0dd5-4270-84e0-9c76d6d7ac46-kube-api-access-t6ghw\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-b4jjs\" (UID: \"09590223-0dd5-4270-84e0-9c76d6d7ac46\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"
Apr 20 14:59:15.466366 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.466291 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"
Apr 20 14:59:15.590947 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.590919 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs"]
Apr 20 14:59:15.594046 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:59:15.593991 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09590223_0dd5_4270_84e0_9c76d6d7ac46.slice/crio-fe3517a871cdfd44cdf10b14835380619fbc9cc7916fd6ca6ce128972e46eae0 WatchSource:0}: Error finding container fe3517a871cdfd44cdf10b14835380619fbc9cc7916fd6ca6ce128972e46eae0: Status 404 returned error can't find the container with id fe3517a871cdfd44cdf10b14835380619fbc9cc7916fd6ca6ce128972e46eae0
Apr 20 14:59:15.896305 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:15.896269 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs" event={"ID":"09590223-0dd5-4270-84e0-9c76d6d7ac46","Type":"ContainerStarted","Data":"fe3517a871cdfd44cdf10b14835380619fbc9cc7916fd6ca6ce128972e46eae0"}
Apr 20 14:59:17.904995 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:17.904957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs" event={"ID":"09590223-0dd5-4270-84e0-9c76d6d7ac46","Type":"ContainerStarted","Data":"5a684fd72aa5de8039011745f339c03b3b081c3270fa9f4f222e8029b5736367"}
Apr 20 14:59:17.926056 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:17.925986 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-b4jjs" podStartSLOduration=1.194028179 podStartE2EDuration="2.925967994s" podCreationTimestamp="2026-04-20 14:59:15 +0000 UTC" firstStartedPulling="2026-04-20 14:59:15.596393507 +0000 UTC m=+327.288936984" lastFinishedPulling="2026-04-20 14:59:17.328333323 +0000 UTC m=+329.020876799" observedRunningTime="2026-04-20 14:59:17.923506509 +0000 UTC m=+329.616050009" watchObservedRunningTime="2026-04-20 14:59:17.925967994 +0000 UTC m=+329.618511494"
Apr 20 14:59:18.829649 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.829614 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"]
Apr 20 14:59:18.832993 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.832970 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:18.835433 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.835405 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 14:59:18.836390 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.836364 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ckg7g\""
Apr 20 14:59:18.836490 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.836364 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 14:59:18.840611 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.840583 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"]
Apr 20 14:59:18.983229 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.983187 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b7xs\" (UniqueName: \"kubernetes.io/projected/726fce48-e508-45d1-9941-45968ca0ba56-kube-api-access-8b7xs\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:18.983582 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.983433 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:18.983582 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:18.983489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:19.084237 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.084145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:19.084237 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.084189 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:19.084456 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.084250 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b7xs\" (UniqueName: \"kubernetes.io/projected/726fce48-e508-45d1-9941-45968ca0ba56-kube-api-access-8b7xs\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:19.084617 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.084593 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:19.084684 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.084606 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:19.095598 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.095570 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b7xs\" (UniqueName: \"kubernetes.io/projected/726fce48-e508-45d1-9941-45968ca0ba56-kube-api-access-8b7xs\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:19.143244 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.143209 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:19.261168 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.261136 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"]
Apr 20 14:59:19.263695 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:59:19.263665 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod726fce48_e508_45d1_9941_45968ca0ba56.slice/crio-58ce3c7ef76e16085fb4d142a63f223dcb1f630546decd74f211f39edafe0089 WatchSource:0}: Error finding container 58ce3c7ef76e16085fb4d142a63f223dcb1f630546decd74f211f39edafe0089: Status 404 returned error can't find the container with id 58ce3c7ef76e16085fb4d142a63f223dcb1f630546decd74f211f39edafe0089
Apr 20 14:59:19.912439 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.912405 2575 generic.go:358] "Generic (PLEG): container finished" podID="726fce48-e508-45d1-9941-45968ca0ba56" containerID="9a7d6aa3824489d14a1d605f5772904695b4b6719cef56e926fa41be79c0fe37" exitCode=0
Apr 20 14:59:19.912685 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.912446 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb" event={"ID":"726fce48-e508-45d1-9941-45968ca0ba56","Type":"ContainerDied","Data":"9a7d6aa3824489d14a1d605f5772904695b4b6719cef56e926fa41be79c0fe37"}
Apr 20 14:59:19.912685 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:19.912470 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb" event={"ID":"726fce48-e508-45d1-9941-45968ca0ba56","Type":"ContainerStarted","Data":"58ce3c7ef76e16085fb4d142a63f223dcb1f630546decd74f211f39edafe0089"}
Apr 20 14:59:22.168857 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.168778 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"]
Apr 20 14:59:22.172734 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.172717 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"
Apr 20 14:59:22.175036 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.174993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 14:59:22.175991 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.175972 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-cq4t2\""
Apr 20 14:59:22.175991 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.175986 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 14:59:22.178334 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.178314 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"]
Apr 20 14:59:22.309448 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.309416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c39b75c-c7e6-4683-9b9d-1f78fa3653f7-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bmvsz\" (UID: \"3c39b75c-c7e6-4683-9b9d-1f78fa3653f7\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"
Apr 20 14:59:22.309608 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.309468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b2k9\" (UniqueName: \"kubernetes.io/projected/3c39b75c-c7e6-4683-9b9d-1f78fa3653f7-kube-api-access-2b2k9\") pod \"cert-manager-cainjector-8966b78d4-bmvsz\" (UID: \"3c39b75c-c7e6-4683-9b9d-1f78fa3653f7\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"
Apr 20 14:59:22.410467 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.410436 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c39b75c-c7e6-4683-9b9d-1f78fa3653f7-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bmvsz\" (UID: \"3c39b75c-c7e6-4683-9b9d-1f78fa3653f7\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"
Apr 20 14:59:22.410591 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.410496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b2k9\" (UniqueName: \"kubernetes.io/projected/3c39b75c-c7e6-4683-9b9d-1f78fa3653f7-kube-api-access-2b2k9\") pod \"cert-manager-cainjector-8966b78d4-bmvsz\" (UID: \"3c39b75c-c7e6-4683-9b9d-1f78fa3653f7\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"
Apr 20 14:59:22.418723 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.418695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c39b75c-c7e6-4683-9b9d-1f78fa3653f7-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-bmvsz\" (UID: \"3c39b75c-c7e6-4683-9b9d-1f78fa3653f7\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"
Apr 20 14:59:22.418882 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.418844 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b2k9\" (UniqueName: \"kubernetes.io/projected/3c39b75c-c7e6-4683-9b9d-1f78fa3653f7-kube-api-access-2b2k9\") pod \"cert-manager-cainjector-8966b78d4-bmvsz\" (UID: \"3c39b75c-c7e6-4683-9b9d-1f78fa3653f7\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"
Apr 20 14:59:22.490766 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.490739 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"
Apr 20 14:59:22.620254 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.620231 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-bmvsz"]
Apr 20 14:59:22.622054 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:59:22.622003 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c39b75c_c7e6_4683_9b9d_1f78fa3653f7.slice/crio-c7da572c30c2faa75b355e75e392d7ea70795c5a5641b8bb089b928f4fdf17c3 WatchSource:0}: Error finding container c7da572c30c2faa75b355e75e392d7ea70795c5a5641b8bb089b928f4fdf17c3: Status 404 returned error can't find the container with id c7da572c30c2faa75b355e75e392d7ea70795c5a5641b8bb089b928f4fdf17c3
Apr 20 14:59:22.926919 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.926879 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz" event={"ID":"3c39b75c-c7e6-4683-9b9d-1f78fa3653f7","Type":"ContainerStarted","Data":"c7da572c30c2faa75b355e75e392d7ea70795c5a5641b8bb089b928f4fdf17c3"}
Apr 20 14:59:22.928339 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.928316 2575 generic.go:358] "Generic (PLEG): container finished" podID="726fce48-e508-45d1-9941-45968ca0ba56" containerID="318ed7f7bd77c855362b4340581c8acf3380f1c1392bdbcf95631cd8da67f732" exitCode=0
Apr 20 14:59:22.928411 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:22.928364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb" event={"ID":"726fce48-e508-45d1-9941-45968ca0ba56","Type":"ContainerDied","Data":"318ed7f7bd77c855362b4340581c8acf3380f1c1392bdbcf95631cd8da67f732"}
Apr 20 14:59:23.934546 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:23.934488 2575 generic.go:358] "Generic (PLEG): container finished" podID="726fce48-e508-45d1-9941-45968ca0ba56" containerID="987587c59cb0315cffbf98a9209721b2521876dd27f93a03a8b3ad0d78bf3437" exitCode=0
Apr 20 14:59:23.935041 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:23.934586 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb" event={"ID":"726fce48-e508-45d1-9941-45968ca0ba56","Type":"ContainerDied","Data":"987587c59cb0315cffbf98a9209721b2521876dd27f93a03a8b3ad0d78bf3437"}
Apr 20 14:59:25.282898 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.282875 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:25.439562 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.439528 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-util\") pod \"726fce48-e508-45d1-9941-45968ca0ba56\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") "
Apr 20 14:59:25.439562 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.439563 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-bundle\") pod \"726fce48-e508-45d1-9941-45968ca0ba56\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") "
Apr 20 14:59:25.439780 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.439592 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b7xs\" (UniqueName: \"kubernetes.io/projected/726fce48-e508-45d1-9941-45968ca0ba56-kube-api-access-8b7xs\") pod \"726fce48-e508-45d1-9941-45968ca0ba56\" (UID: \"726fce48-e508-45d1-9941-45968ca0ba56\") "
Apr 20 14:59:25.439950 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.439917 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-bundle" (OuterVolumeSpecName: "bundle") pod "726fce48-e508-45d1-9941-45968ca0ba56" (UID: "726fce48-e508-45d1-9941-45968ca0ba56"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:59:25.441953 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.441924 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/726fce48-e508-45d1-9941-45968ca0ba56-kube-api-access-8b7xs" (OuterVolumeSpecName: "kube-api-access-8b7xs") pod "726fce48-e508-45d1-9941-45968ca0ba56" (UID: "726fce48-e508-45d1-9941-45968ca0ba56"). InnerVolumeSpecName "kube-api-access-8b7xs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:59:25.445509 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.445480 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-util" (OuterVolumeSpecName: "util") pod "726fce48-e508-45d1-9941-45968ca0ba56" (UID: "726fce48-e508-45d1-9941-45968ca0ba56"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:59:25.540719 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.540690 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8b7xs\" (UniqueName: \"kubernetes.io/projected/726fce48-e508-45d1-9941-45968ca0ba56-kube-api-access-8b7xs\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:25.540719 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.540718 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:25.540930 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.540728 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/726fce48-e508-45d1-9941-45968ca0ba56-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:25.942115 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.942080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz" event={"ID":"3c39b75c-c7e6-4683-9b9d-1f78fa3653f7","Type":"ContainerStarted","Data":"ded0e7e6453ff06f2ac50ab68c0d74eb163b184f9332f3a70f18027647133a3e"}
Apr 20 14:59:25.943772 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.943749 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb" event={"ID":"726fce48-e508-45d1-9941-45968ca0ba56","Type":"ContainerDied","Data":"58ce3c7ef76e16085fb4d142a63f223dcb1f630546decd74f211f39edafe0089"}
Apr 20 14:59:25.943874 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.943777 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58ce3c7ef76e16085fb4d142a63f223dcb1f630546decd74f211f39edafe0089"
Apr 20 14:59:25.943874 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.943784 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f9vwtb"
Apr 20 14:59:25.958322 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:25.958282 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-bmvsz" podStartSLOduration=1.2432723 podStartE2EDuration="3.958265868s" podCreationTimestamp="2026-04-20 14:59:22 +0000 UTC" firstStartedPulling="2026-04-20 14:59:22.623823677 +0000 UTC m=+334.316367157" lastFinishedPulling="2026-04-20 14:59:25.338817248 +0000 UTC m=+337.031360725" observedRunningTime="2026-04-20 14:59:25.956369488 +0000 UTC m=+337.648912987" watchObservedRunningTime="2026-04-20 14:59:25.958265868 +0000 UTC m=+337.650809367"
Apr 20 14:59:39.198646 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.198613 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7"]
Apr 20 14:59:39.199116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.198951 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="726fce48-e508-45d1-9941-45968ca0ba56" containerName="pull"
Apr 20 14:59:39.199116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.198963 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="726fce48-e508-45d1-9941-45968ca0ba56" containerName="pull"
Apr 20 14:59:39.199116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.198972 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="726fce48-e508-45d1-9941-45968ca0ba56" containerName="extract"
Apr 20 14:59:39.199116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.198978 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="726fce48-e508-45d1-9941-45968ca0ba56" containerName="extract"
Apr 20 14:59:39.199116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.198988 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="726fce48-e508-45d1-9941-45968ca0ba56" containerName="util"
Apr 20 14:59:39.199116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.198994 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="726fce48-e508-45d1-9941-45968ca0ba56" containerName="util"
Apr 20 14:59:39.199116 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.199074 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="726fce48-e508-45d1-9941-45968ca0ba56" containerName="extract"
Apr 20 14:59:39.202626 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.202608 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7"
Apr 20 14:59:39.204921 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.204897 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 14:59:39.205108 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.204899 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 14:59:39.205108 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.205007 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ckg7g\""
Apr 20 14:59:39.208876 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.208856 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7"]
Apr 20 14:59:39.250440 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.250415 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnvbr\" (UniqueName: \"kubernetes.io/projected/22717d19-43e1-4726-8f7f-1a238ec1cf4a-kube-api-access-lnvbr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7"
Apr 20 14:59:39.250579 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.250448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7"
Apr 20 14:59:39.250579 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.250482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7"
Apr 20 14:59:39.351210 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.351181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7"
Apr 20 14:59:39.351346 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.351236 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName:
\"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" Apr 20 14:59:39.351346 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.351294 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnvbr\" (UniqueName: \"kubernetes.io/projected/22717d19-43e1-4726-8f7f-1a238ec1cf4a-kube-api-access-lnvbr\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" Apr 20 14:59:39.351635 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.351611 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" Apr 20 14:59:39.351678 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.351643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" Apr 20 14:59:39.358649 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.358619 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnvbr\" (UniqueName: \"kubernetes.io/projected/22717d19-43e1-4726-8f7f-1a238ec1cf4a-kube-api-access-lnvbr\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" Apr 20 14:59:39.512035 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.511939 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" Apr 20 14:59:39.634052 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.634010 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7"] Apr 20 14:59:39.635969 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:59:39.635934 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22717d19_43e1_4726_8f7f_1a238ec1cf4a.slice/crio-d1ab78b92b9de07c9d81672d781d1f159d086c5a7586f25e26fd059d02568921 WatchSource:0}: Error finding container d1ab78b92b9de07c9d81672d781d1f159d086c5a7586f25e26fd059d02568921: Status 404 returned error can't find the container with id d1ab78b92b9de07c9d81672d781d1f159d086c5a7586f25e26fd059d02568921 Apr 20 14:59:39.991197 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.991165 2575 generic.go:358] "Generic (PLEG): container finished" podID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerID="bf9b800397dd63f0233e1f941d3f2fc1f3274501827b359d22ca3684552c837a" exitCode=0 Apr 20 14:59:39.991414 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.991261 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" event={"ID":"22717d19-43e1-4726-8f7f-1a238ec1cf4a","Type":"ContainerDied","Data":"bf9b800397dd63f0233e1f941d3f2fc1f3274501827b359d22ca3684552c837a"} Apr 20 14:59:39.991414 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:39.991292 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" event={"ID":"22717d19-43e1-4726-8f7f-1a238ec1cf4a","Type":"ContainerStarted","Data":"d1ab78b92b9de07c9d81672d781d1f159d086c5a7586f25e26fd059d02568921"} Apr 20 14:59:40.995860 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:40.995784 2575 generic.go:358] "Generic (PLEG): container finished" podID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerID="0f1dc1d0cc94d8d7e6a0262dcf26ffafe5b9f936ed1733fac287ed334b9e84a9" exitCode=0 Apr 20 14:59:40.996255 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:40.995856 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" event={"ID":"22717d19-43e1-4726-8f7f-1a238ec1cf4a","Type":"ContainerDied","Data":"0f1dc1d0cc94d8d7e6a0262dcf26ffafe5b9f936ed1733fac287ed334b9e84a9"} Apr 20 14:59:42.000963 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:42.000930 2575 generic.go:358] "Generic (PLEG): container finished" podID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerID="38a83714f9a3763da3f2b8b53052d29e5f59430c8a252c6b12b6cb4a385664ce" exitCode=0 Apr 20 14:59:42.001355 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:42.000991 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" event={"ID":"22717d19-43e1-4726-8f7f-1a238ec1cf4a","Type":"ContainerDied","Data":"38a83714f9a3763da3f2b8b53052d29e5f59430c8a252c6b12b6cb4a385664ce"} Apr 20 14:59:43.141832 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.141808 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" Apr 20 14:59:43.177599 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.177567 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-util\") pod \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " Apr 20 14:59:43.177754 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.177629 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-bundle\") pod \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " Apr 20 14:59:43.177754 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.177691 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnvbr\" (UniqueName: \"kubernetes.io/projected/22717d19-43e1-4726-8f7f-1a238ec1cf4a-kube-api-access-lnvbr\") pod \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\" (UID: \"22717d19-43e1-4726-8f7f-1a238ec1cf4a\") " Apr 20 14:59:43.178536 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.178499 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-bundle" (OuterVolumeSpecName: "bundle") pod "22717d19-43e1-4726-8f7f-1a238ec1cf4a" (UID: "22717d19-43e1-4726-8f7f-1a238ec1cf4a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:59:43.179876 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.179846 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22717d19-43e1-4726-8f7f-1a238ec1cf4a-kube-api-access-lnvbr" (OuterVolumeSpecName: "kube-api-access-lnvbr") pod "22717d19-43e1-4726-8f7f-1a238ec1cf4a" (UID: "22717d19-43e1-4726-8f7f-1a238ec1cf4a"). InnerVolumeSpecName "kube-api-access-lnvbr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:59:43.182983 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.182958 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-util" (OuterVolumeSpecName: "util") pod "22717d19-43e1-4726-8f7f-1a238ec1cf4a" (UID: "22717d19-43e1-4726-8f7f-1a238ec1cf4a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:59:43.278937 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.278911 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lnvbr\" (UniqueName: \"kubernetes.io/projected/22717d19-43e1-4726-8f7f-1a238ec1cf4a-kube-api-access-lnvbr\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:59:43.278937 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.278937 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:59:43.279102 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:43.278947 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/22717d19-43e1-4726-8f7f-1a238ec1cf4a-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 14:59:44.010344 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:44.010317 2575 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" Apr 20 14:59:44.010550 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:44.010305 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5476c7" event={"ID":"22717d19-43e1-4726-8f7f-1a238ec1cf4a","Type":"ContainerDied","Data":"d1ab78b92b9de07c9d81672d781d1f159d086c5a7586f25e26fd059d02568921"} Apr 20 14:59:44.010550 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:44.010428 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ab78b92b9de07c9d81672d781d1f159d086c5a7586f25e26fd059d02568921" Apr 20 14:59:49.615960 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.615915 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848"] Apr 20 14:59:49.616484 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.616463 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerName="util" Apr 20 14:59:49.616560 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.616488 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerName="util" Apr 20 14:59:49.616560 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.616500 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerName="pull" Apr 20 14:59:49.616560 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.616508 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerName="pull" Apr 20 14:59:49.616560 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.616520 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerName="extract" Apr 20 14:59:49.616560 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.616529 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerName="extract" Apr 20 14:59:49.616800 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.616635 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="22717d19-43e1-4726-8f7f-1a238ec1cf4a" containerName="extract" Apr 20 14:59:49.620856 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.620833 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.623399 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.623374 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 14:59:49.623571 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.623439 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 14:59:49.624596 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.624557 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ckg7g\"" Apr 20 14:59:49.626554 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.626527 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848"] Apr 20 14:59:49.728856 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.728826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: 
\"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.728856 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.728856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.729084 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.728883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqjgh\" (UniqueName: \"kubernetes.io/projected/3d8105ce-4432-41b8-9505-ff28df6e03c3-kube-api-access-kqjgh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.829756 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.829727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.829756 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.829759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.829952 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.829806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqjgh\" (UniqueName: \"kubernetes.io/projected/3d8105ce-4432-41b8-9505-ff28df6e03c3-kube-api-access-kqjgh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.830209 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.830189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.830254 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.830220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.841132 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.841106 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqjgh\" (UniqueName: \"kubernetes.io/projected/3d8105ce-4432-41b8-9505-ff28df6e03c3-kube-api-access-kqjgh\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:49.930429 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:49.930345 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" Apr 20 14:59:50.051947 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:50.051926 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848"] Apr 20 14:59:50.054208 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:59:50.054177 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d8105ce_4432_41b8_9505_ff28df6e03c3.slice/crio-8fb56f26976364d1e260f2e9b74c95f73bf8bb5be94535f47a26419835e43629 WatchSource:0}: Error finding container 8fb56f26976364d1e260f2e9b74c95f73bf8bb5be94535f47a26419835e43629: Status 404 returned error can't find the container with id 8fb56f26976364d1e260f2e9b74c95f73bf8bb5be94535f47a26419835e43629 Apr 20 14:59:51.036739 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.036709 2575 generic.go:358] "Generic (PLEG): container finished" podID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerID="5eaad99d4237ec67e7655404dd59cc6934fea45d6a0d9a681efeae17f81775d6" exitCode=0 Apr 20 14:59:51.037153 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.036761 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" event={"ID":"3d8105ce-4432-41b8-9505-ff28df6e03c3","Type":"ContainerDied","Data":"5eaad99d4237ec67e7655404dd59cc6934fea45d6a0d9a681efeae17f81775d6"} Apr 20 14:59:51.037153 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.036782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" event={"ID":"3d8105ce-4432-41b8-9505-ff28df6e03c3","Type":"ContainerStarted","Data":"8fb56f26976364d1e260f2e9b74c95f73bf8bb5be94535f47a26419835e43629"} Apr 20 14:59:51.080117 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.080082 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22"] Apr 20 14:59:51.083406 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.083377 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.085886 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.085860 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 14:59:51.085886 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.085874 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 14:59:51.086071 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.085887 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 14:59:51.086071 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.085875 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-525vc\"" Apr 20 14:59:51.086188 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.086145 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 14:59:51.098763 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.098712 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22"] 
Apr 20 14:59:51.139781 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.139750 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.139906 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.139865 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg72b\" (UniqueName: \"kubernetes.io/projected/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-kube-api-access-pg72b\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.139983 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.139908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.240369 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.240330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg72b\" (UniqueName: \"kubernetes.io/projected/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-kube-api-access-pg72b\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.240369 ip-10-0-140-93 kubenswrapper[2575]: 
I0420 14:59:51.240369 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.240574 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.240406 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.243032 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.242994 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-webhook-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.243144 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.243054 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-apiservice-cert\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.248053 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.248012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg72b\" (UniqueName: 
\"kubernetes.io/projected/7f6d266c-54c0-444c-bc4f-a73bfe2997b7-kube-api-access-pg72b\") pod \"opendatahub-operator-controller-manager-854569cf8c-g6m22\" (UID: \"7f6d266c-54c0-444c-bc4f-a73bfe2997b7\") " pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.394111 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.394008 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" Apr 20 14:59:51.538126 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:51.538100 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22"] Apr 20 14:59:51.539776 ip-10-0-140-93 kubenswrapper[2575]: W0420 14:59:51.539751 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f6d266c_54c0_444c_bc4f_a73bfe2997b7.slice/crio-235d18e673ff455bd7440e6bfaf7f5b400b1f413ec5436c434b19c7c38f1ed02 WatchSource:0}: Error finding container 235d18e673ff455bd7440e6bfaf7f5b400b1f413ec5436c434b19c7c38f1ed02: Status 404 returned error can't find the container with id 235d18e673ff455bd7440e6bfaf7f5b400b1f413ec5436c434b19c7c38f1ed02 Apr 20 14:59:52.042399 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:52.042366 2575 generic.go:358] "Generic (PLEG): container finished" podID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerID="8189c59194e28194af9e84a7a828a21bea16e000d1916a20625ef0dae40e9650" exitCode=0 Apr 20 14:59:52.042890 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:52.042459 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" event={"ID":"3d8105ce-4432-41b8-9505-ff28df6e03c3","Type":"ContainerDied","Data":"8189c59194e28194af9e84a7a828a21bea16e000d1916a20625ef0dae40e9650"} Apr 20 14:59:52.044003 ip-10-0-140-93 kubenswrapper[2575]: I0420 
14:59:52.043973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" event={"ID":"7f6d266c-54c0-444c-bc4f-a73bfe2997b7","Type":"ContainerStarted","Data":"235d18e673ff455bd7440e6bfaf7f5b400b1f413ec5436c434b19c7c38f1ed02"}
Apr 20 14:59:53.050792 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:53.050756 2575 generic.go:358] "Generic (PLEG): container finished" podID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerID="ef36952e9351e6c39fe5c3adb7d9c68b9068dc41e9d93aca1f3a27b5f9ddc95c" exitCode=0
Apr 20 14:59:53.051252 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:53.050853 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" event={"ID":"3d8105ce-4432-41b8-9505-ff28df6e03c3","Type":"ContainerDied","Data":"ef36952e9351e6c39fe5c3adb7d9c68b9068dc41e9d93aca1f3a27b5f9ddc95c"}
Apr 20 14:59:54.178232 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.178205 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848"
Apr 20 14:59:54.265410 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.265376 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-util\") pod \"3d8105ce-4432-41b8-9505-ff28df6e03c3\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") "
Apr 20 14:59:54.265564 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.265498 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-bundle\") pod \"3d8105ce-4432-41b8-9505-ff28df6e03c3\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") "
Apr 20 14:59:54.265564 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.265537 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqjgh\" (UniqueName: \"kubernetes.io/projected/3d8105ce-4432-41b8-9505-ff28df6e03c3-kube-api-access-kqjgh\") pod \"3d8105ce-4432-41b8-9505-ff28df6e03c3\" (UID: \"3d8105ce-4432-41b8-9505-ff28df6e03c3\") "
Apr 20 14:59:54.266326 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.266253 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-bundle" (OuterVolumeSpecName: "bundle") pod "3d8105ce-4432-41b8-9505-ff28df6e03c3" (UID: "3d8105ce-4432-41b8-9505-ff28df6e03c3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:59:54.267731 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.267705 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8105ce-4432-41b8-9505-ff28df6e03c3-kube-api-access-kqjgh" (OuterVolumeSpecName: "kube-api-access-kqjgh") pod "3d8105ce-4432-41b8-9505-ff28df6e03c3" (UID: "3d8105ce-4432-41b8-9505-ff28df6e03c3"). InnerVolumeSpecName "kube-api-access-kqjgh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 14:59:54.270821 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.270785 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-util" (OuterVolumeSpecName: "util") pod "3d8105ce-4432-41b8-9505-ff28df6e03c3" (UID: "3d8105ce-4432-41b8-9505-ff28df6e03c3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 14:59:54.366632 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.366605 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:54.366632 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.366629 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kqjgh\" (UniqueName: \"kubernetes.io/projected/3d8105ce-4432-41b8-9505-ff28df6e03c3-kube-api-access-kqjgh\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:54.366796 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:54.366640 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8105ce-4432-41b8-9505-ff28df6e03c3-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 14:59:55.060742 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:55.060713 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848"
Apr 20 14:59:55.060916 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:55.060714 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9qc848" event={"ID":"3d8105ce-4432-41b8-9505-ff28df6e03c3","Type":"ContainerDied","Data":"8fb56f26976364d1e260f2e9b74c95f73bf8bb5be94535f47a26419835e43629"}
Apr 20 14:59:55.060916 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:55.060828 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fb56f26976364d1e260f2e9b74c95f73bf8bb5be94535f47a26419835e43629"
Apr 20 14:59:55.062331 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:55.062306 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" event={"ID":"7f6d266c-54c0-444c-bc4f-a73bfe2997b7","Type":"ContainerStarted","Data":"8aa7e0d8e00c042b40ac34ecd7199ae4df6c1e4c29bcaed0ad7ed65d75ddca82"}
Apr 20 14:59:55.062488 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:55.062471 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22"
Apr 20 14:59:55.082914 ip-10-0-140-93 kubenswrapper[2575]: I0420 14:59:55.082869 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22" podStartSLOduration=1.597954785 podStartE2EDuration="4.082858241s" podCreationTimestamp="2026-04-20 14:59:51 +0000 UTC" firstStartedPulling="2026-04-20 14:59:51.541486576 +0000 UTC m=+363.234030054" lastFinishedPulling="2026-04-20 14:59:54.026390034 +0000 UTC m=+365.718933510" observedRunningTime="2026-04-20 14:59:55.081301143 +0000 UTC m=+366.773844639" watchObservedRunningTime="2026-04-20 14:59:55.082858241 +0000 UTC m=+366.775401739"
Apr 20 15:00:04.376363 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.376324 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"]
Apr 20 15:00:04.376719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.376650 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerName="util"
Apr 20 15:00:04.376719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.376660 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerName="util"
Apr 20 15:00:04.376719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.376672 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerName="extract"
Apr 20 15:00:04.376719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.376678 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerName="extract"
Apr 20 15:00:04.376719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.376692 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerName="pull"
Apr 20 15:00:04.376719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.376698 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerName="pull"
Apr 20 15:00:04.376899 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.376747 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="3d8105ce-4432-41b8-9505-ff28df6e03c3" containerName="extract"
Apr 20 15:00:04.379516 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.379501 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.383041 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.382992 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 15:00:04.383041 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.383014 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 15:00:04.383253 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.383090 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 15:00:04.383253 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.383104 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lj4z8\""
Apr 20 15:00:04.383253 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.383132 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 15:00:04.383253 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.383132 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 15:00:04.388488 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.388466 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"]
Apr 20 15:00:04.445098 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.445063 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22b1eef0-9570-4084-861a-bee77a3ad8ef-cert\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.445265 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.445112 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/22b1eef0-9570-4084-861a-bee77a3ad8ef-manager-config\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.445265 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.445134 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl8v8\" (UniqueName: \"kubernetes.io/projected/22b1eef0-9570-4084-861a-bee77a3ad8ef-kube-api-access-pl8v8\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.445265 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.445169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/22b1eef0-9570-4084-861a-bee77a3ad8ef-metrics-cert\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.546398 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.546365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22b1eef0-9570-4084-861a-bee77a3ad8ef-cert\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.546574 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.546426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/22b1eef0-9570-4084-861a-bee77a3ad8ef-manager-config\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.546574 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.546454 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl8v8\" (UniqueName: \"kubernetes.io/projected/22b1eef0-9570-4084-861a-bee77a3ad8ef-kube-api-access-pl8v8\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.546574 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.546479 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/22b1eef0-9570-4084-861a-bee77a3ad8ef-metrics-cert\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.547248 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.547219 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/22b1eef0-9570-4084-861a-bee77a3ad8ef-manager-config\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.549242 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.549212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/22b1eef0-9570-4084-861a-bee77a3ad8ef-metrics-cert\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.549363 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.549244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22b1eef0-9570-4084-861a-bee77a3ad8ef-cert\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.558929 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.558904 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl8v8\" (UniqueName: \"kubernetes.io/projected/22b1eef0-9570-4084-861a-bee77a3ad8ef-kube-api-access-pl8v8\") pod \"lws-controller-manager-54f6c466b9-w6shx\" (UID: \"22b1eef0-9570-4084-861a-bee77a3ad8ef\") " pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.689733 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.689635 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:04.815676 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:04.815647 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"]
Apr 20 15:00:04.817644 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:00:04.817616 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b1eef0_9570_4084_861a_bee77a3ad8ef.slice/crio-8c4b9387ddc293fe1cef3758255105666bbcd1a3d15b0e155bccd259d3da9ee7 WatchSource:0}: Error finding container 8c4b9387ddc293fe1cef3758255105666bbcd1a3d15b0e155bccd259d3da9ee7: Status 404 returned error can't find the container with id 8c4b9387ddc293fe1cef3758255105666bbcd1a3d15b0e155bccd259d3da9ee7
Apr 20 15:00:05.102585 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:05.102546 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx" event={"ID":"22b1eef0-9570-4084-861a-bee77a3ad8ef","Type":"ContainerStarted","Data":"8c4b9387ddc293fe1cef3758255105666bbcd1a3d15b0e155bccd259d3da9ee7"}
Apr 20 15:00:06.067093 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:06.067055 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-854569cf8c-g6m22"
Apr 20 15:00:08.282371 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.282341 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"]
Apr 20 15:00:08.287351 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.287329 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.290149 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.290127 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ckg7g\""
Apr 20 15:00:08.290370 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.290351 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 15:00:08.291108 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.291088 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 15:00:08.294380 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.294348 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"]
Apr 20 15:00:08.384494 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.384456 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k88l\" (UniqueName: \"kubernetes.io/projected/1baf4afd-b2f3-41e5-b470-da8373cdbfab-kube-api-access-5k88l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.384645 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.384554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.384645 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.384640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.485671 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.485638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.485829 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.485672 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k88l\" (UniqueName: \"kubernetes.io/projected/1baf4afd-b2f3-41e5-b470-da8373cdbfab-kube-api-access-5k88l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.485829 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.485809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.486135 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.486114 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.486219 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.486140 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.493735 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.493714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k88l\" (UniqueName: \"kubernetes.io/projected/1baf4afd-b2f3-41e5-b470-da8373cdbfab-kube-api-access-5k88l\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.597990 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.597922 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:08.720014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:08.719987 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"]
Apr 20 15:00:08.722417 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:00:08.722392 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1baf4afd_b2f3_41e5_b470_da8373cdbfab.slice/crio-d3c2cdf801e871d6a29e2b49332d1d26889c1719a42f9c6e1373026332361495 WatchSource:0}: Error finding container d3c2cdf801e871d6a29e2b49332d1d26889c1719a42f9c6e1373026332361495: Status 404 returned error can't find the container with id d3c2cdf801e871d6a29e2b49332d1d26889c1719a42f9c6e1373026332361495
Apr 20 15:00:09.117563 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:09.117525 2575 generic.go:358] "Generic (PLEG): container finished" podID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerID="24c617a815a3f9db829b24bd9f418e008383a936c8e18f7e29aa4b4f025f8528" exitCode=0
Apr 20 15:00:09.117758 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:09.117620 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725" event={"ID":"1baf4afd-b2f3-41e5-b470-da8373cdbfab","Type":"ContainerDied","Data":"24c617a815a3f9db829b24bd9f418e008383a936c8e18f7e29aa4b4f025f8528"}
Apr 20 15:00:09.117758 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:09.117661 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725" event={"ID":"1baf4afd-b2f3-41e5-b470-da8373cdbfab","Type":"ContainerStarted","Data":"d3c2cdf801e871d6a29e2b49332d1d26889c1719a42f9c6e1373026332361495"}
Apr 20 15:00:11.125768 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:11.125725 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx" event={"ID":"22b1eef0-9570-4084-861a-bee77a3ad8ef","Type":"ContainerStarted","Data":"42c553afdcfd02ba9295cd34dd1616e1c7913b112e2b0137420d4e2a0919d72c"}
Apr 20 15:00:11.126259 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:11.125833 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:11.127410 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:11.127389 2575 generic.go:358] "Generic (PLEG): container finished" podID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerID="a00696d219b01e20eb0e5325b68630f789bb6261437a47a1bec258ba349117cc" exitCode=0
Apr 20 15:00:11.127513 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:11.127417 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725" event={"ID":"1baf4afd-b2f3-41e5-b470-da8373cdbfab","Type":"ContainerDied","Data":"a00696d219b01e20eb0e5325b68630f789bb6261437a47a1bec258ba349117cc"}
Apr 20 15:00:11.141834 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:11.141795 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx" podStartSLOduration=1.5729645840000002 podStartE2EDuration="7.141784314s" podCreationTimestamp="2026-04-20 15:00:04 +0000 UTC" firstStartedPulling="2026-04-20 15:00:04.819461132 +0000 UTC m=+376.512004609" lastFinishedPulling="2026-04-20 15:00:10.388280862 +0000 UTC m=+382.080824339" observedRunningTime="2026-04-20 15:00:11.139437388 +0000 UTC m=+382.831980888" watchObservedRunningTime="2026-04-20 15:00:11.141784314 +0000 UTC m=+382.834327827"
Apr 20 15:00:12.132035 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:12.131987 2575 generic.go:358] "Generic (PLEG): container finished" podID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerID="5562627716f23e16a6393b3635fabaf25699a124e7ac6d8c5bd7f56f822acc54" exitCode=0
Apr 20 15:00:12.132388 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:12.132079 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725" event={"ID":"1baf4afd-b2f3-41e5-b470-da8373cdbfab","Type":"ContainerDied","Data":"5562627716f23e16a6393b3635fabaf25699a124e7ac6d8c5bd7f56f822acc54"}
Apr 20 15:00:13.254096 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.254076 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:13.330068 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.330044 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k88l\" (UniqueName: \"kubernetes.io/projected/1baf4afd-b2f3-41e5-b470-da8373cdbfab-kube-api-access-5k88l\") pod \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") "
Apr 20 15:00:13.330209 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.330124 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-bundle\") pod \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") "
Apr 20 15:00:13.330209 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.330176 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-util\") pod \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\" (UID: \"1baf4afd-b2f3-41e5-b470-da8373cdbfab\") "
Apr 20 15:00:13.330951 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.330923 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-bundle" (OuterVolumeSpecName: "bundle") pod "1baf4afd-b2f3-41e5-b470-da8373cdbfab" (UID: "1baf4afd-b2f3-41e5-b470-da8373cdbfab"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:00:13.332159 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.332129 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1baf4afd-b2f3-41e5-b470-da8373cdbfab-kube-api-access-5k88l" (OuterVolumeSpecName: "kube-api-access-5k88l") pod "1baf4afd-b2f3-41e5-b470-da8373cdbfab" (UID: "1baf4afd-b2f3-41e5-b470-da8373cdbfab"). InnerVolumeSpecName "kube-api-access-5k88l". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:00:13.335836 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.335813 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-util" (OuterVolumeSpecName: "util") pod "1baf4afd-b2f3-41e5-b470-da8373cdbfab" (UID: "1baf4afd-b2f3-41e5-b470-da8373cdbfab"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:00:13.431735 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.431679 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5k88l\" (UniqueName: \"kubernetes.io/projected/1baf4afd-b2f3-41e5-b470-da8373cdbfab-kube-api-access-5k88l\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 15:00:13.431735 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.431702 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 15:00:13.431735 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:13.431711 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1baf4afd-b2f3-41e5-b470-da8373cdbfab-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 15:00:14.140497 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:14.140464 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725" event={"ID":"1baf4afd-b2f3-41e5-b470-da8373cdbfab","Type":"ContainerDied","Data":"d3c2cdf801e871d6a29e2b49332d1d26889c1719a42f9c6e1373026332361495"}
Apr 20 15:00:14.140497 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:14.140499 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3c2cdf801e871d6a29e2b49332d1d26889c1719a42f9c6e1373026332361495"
Apr 20 15:00:14.140695 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:14.140480 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c48356h725"
Apr 20 15:00:22.134544 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.134510 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-54f6c466b9-w6shx"
Apr 20 15:00:22.419080 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.418975 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"]
Apr 20 15:00:22.419342 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.419327 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerName="util"
Apr 20 15:00:22.419386 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.419343 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerName="util"
Apr 20 15:00:22.419386 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.419361 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerName="pull"
Apr 20 15:00:22.419386 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.419367 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerName="pull"
Apr 20 15:00:22.419386 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.419373 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerName="extract"
Apr 20 15:00:22.419386 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.419379 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerName="extract"
Apr 20 15:00:22.419527 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.419429 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1baf4afd-b2f3-41e5-b470-da8373cdbfab" containerName="extract"
Apr 20 15:00:22.423896 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.423880 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.435874 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.435849 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 15:00:22.436716 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.436696 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ckg7g\""
Apr 20 15:00:22.436716 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.436711 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 15:00:22.439698 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.439675 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"]
Apr 20 15:00:22.514374 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.514345 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.514545 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.514391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzwd4\" (UniqueName: \"kubernetes.io/projected/87c38551-5119-4ce3-9b4a-366f31c2f8f3-kube-api-access-mzwd4\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.514545 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.514479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.615599 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.615569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.615773 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.615613 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzwd4\" (UniqueName: \"kubernetes.io/projected/87c38551-5119-4ce3-9b4a-366f31c2f8f3-kube-api-access-mzwd4\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.615773 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.615660 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.615928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.615909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.615967 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.615941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.637767 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.637736 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzwd4\" (UniqueName: \"kubernetes.io/projected/87c38551-5119-4ce3-9b4a-366f31c2f8f3-kube-api-access-mzwd4\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"
Apr 20 15:00:22.733063 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.732971 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5" Apr 20 15:00:22.883857 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:22.883825 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5"] Apr 20 15:00:22.884997 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:00:22.884969 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c38551_5119_4ce3_9b4a_366f31c2f8f3.slice/crio-d8d690817558be4b398179abee8f1e8d27587c4c735105cb917b111bfe16b024 WatchSource:0}: Error finding container d8d690817558be4b398179abee8f1e8d27587c4c735105cb917b111bfe16b024: Status 404 returned error can't find the container with id d8d690817558be4b398179abee8f1e8d27587c4c735105cb917b111bfe16b024 Apr 20 15:00:23.171258 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:23.171222 2575 generic.go:358] "Generic (PLEG): container finished" podID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerID="b737a78448c507d3f2468a2c5e1fa787d51afa1fd5007bdab39c968f00cc5a64" exitCode=0 Apr 20 15:00:23.171726 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:23.171294 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5" event={"ID":"87c38551-5119-4ce3-9b4a-366f31c2f8f3","Type":"ContainerDied","Data":"b737a78448c507d3f2468a2c5e1fa787d51afa1fd5007bdab39c968f00cc5a64"} Apr 20 15:00:23.171726 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:23.171320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5" event={"ID":"87c38551-5119-4ce3-9b4a-366f31c2f8f3","Type":"ContainerStarted","Data":"d8d690817558be4b398179abee8f1e8d27587c4c735105cb917b111bfe16b024"} Apr 20 15:00:25.178841 ip-10-0-140-93 kubenswrapper[2575]: I0420 
15:00:25.178808 2575 generic.go:358] "Generic (PLEG): container finished" podID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerID="29cce2d08c1de75e4eab4c5da86e905036327678777e03059efcede8c4e45402" exitCode=0 Apr 20 15:00:25.179273 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:25.178857 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5" event={"ID":"87c38551-5119-4ce3-9b4a-366f31c2f8f3","Type":"ContainerDied","Data":"29cce2d08c1de75e4eab4c5da86e905036327678777e03059efcede8c4e45402"} Apr 20 15:00:26.184107 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:26.184003 2575 generic.go:358] "Generic (PLEG): container finished" podID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerID="3dbb352a9236c4f2f4e6f5090d11d500830a1a2e8e42bedb5fe0e19ce75e817b" exitCode=0 Apr 20 15:00:26.184107 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:26.184080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5" event={"ID":"87c38551-5119-4ce3-9b4a-366f31c2f8f3","Type":"ContainerDied","Data":"3dbb352a9236c4f2f4e6f5090d11d500830a1a2e8e42bedb5fe0e19ce75e817b"} Apr 20 15:00:27.312101 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.312079 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5" Apr 20 15:00:27.461877 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.461789 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-util\") pod \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " Apr 20 15:00:27.461877 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.461868 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzwd4\" (UniqueName: \"kubernetes.io/projected/87c38551-5119-4ce3-9b4a-366f31c2f8f3-kube-api-access-mzwd4\") pod \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " Apr 20 15:00:27.462090 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.461940 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-bundle\") pod \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\" (UID: \"87c38551-5119-4ce3-9b4a-366f31c2f8f3\") " Apr 20 15:00:27.462799 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.462777 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-bundle" (OuterVolumeSpecName: "bundle") pod "87c38551-5119-4ce3-9b4a-366f31c2f8f3" (UID: "87c38551-5119-4ce3-9b4a-366f31c2f8f3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:00:27.464215 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.464192 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c38551-5119-4ce3-9b4a-366f31c2f8f3-kube-api-access-mzwd4" (OuterVolumeSpecName: "kube-api-access-mzwd4") pod "87c38551-5119-4ce3-9b4a-366f31c2f8f3" (UID: "87c38551-5119-4ce3-9b4a-366f31c2f8f3"). InnerVolumeSpecName "kube-api-access-mzwd4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:00:27.467824 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.467805 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-util" (OuterVolumeSpecName: "util") pod "87c38551-5119-4ce3-9b4a-366f31c2f8f3" (UID: "87c38551-5119-4ce3-9b4a-366f31c2f8f3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:00:27.563463 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.563433 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:00:27.563463 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.563460 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mzwd4\" (UniqueName: \"kubernetes.io/projected/87c38551-5119-4ce3-9b4a-366f31c2f8f3-kube-api-access-mzwd4\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:00:27.563463 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:27.563471 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87c38551-5119-4ce3-9b4a-366f31c2f8f3-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:00:28.193170 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:28.193133 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5" event={"ID":"87c38551-5119-4ce3-9b4a-366f31c2f8f3","Type":"ContainerDied","Data":"d8d690817558be4b398179abee8f1e8d27587c4c735105cb917b111bfe16b024"} Apr 20 15:00:28.193170 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:28.193160 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2lh5r5" Apr 20 15:00:28.193382 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:28.193167 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8d690817558be4b398179abee8f1e8d27587c4c735105cb917b111bfe16b024" Apr 20 15:00:42.838806 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.838773 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7"] Apr 20 15:00:42.839304 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.839117 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerName="pull" Apr 20 15:00:42.839304 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.839128 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerName="pull" Apr 20 15:00:42.839304 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.839141 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerName="extract" Apr 20 15:00:42.839304 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.839146 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerName="extract" Apr 20 15:00:42.839304 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.839163 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerName="util" Apr 20 15:00:42.839304 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.839170 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerName="util" Apr 20 15:00:42.839304 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.839221 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="87c38551-5119-4ce3-9b4a-366f31c2f8f3" containerName="extract" Apr 20 15:00:42.842342 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.842320 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.844688 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.844666 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 15:00:42.844811 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.844720 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-2frp9\"" Apr 20 15:00:42.851820 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.851798 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7"] Apr 20 15:00:42.882621 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.882598 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f61a3200-3989-4fa9-9836-2a809a33573e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.882750 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.882649 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f61a3200-3989-4fa9-9836-2a809a33573e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.882750 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.882668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9nf9\" (UniqueName: \"kubernetes.io/projected/f61a3200-3989-4fa9-9836-2a809a33573e-kube-api-access-x9nf9\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.882834 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.882758 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.882834 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.882820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.882932 ip-10-0-140-93 
kubenswrapper[2575]: I0420 15:00:42.882887 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.882932 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.882917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f61a3200-3989-4fa9-9836-2a809a33573e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.883061 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.882983 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.883061 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.883045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984258 
ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984258 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984262 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984444 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984285 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f61a3200-3989-4fa9-9836-2a809a33573e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984444 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f61a3200-3989-4fa9-9836-2a809a33573e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984523 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984455 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9nf9\" (UniqueName: \"kubernetes.io/projected/f61a3200-3989-4fa9-9836-2a809a33573e-kube-api-access-x9nf9\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984523 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984626 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984542 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984626 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984584 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984729 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984628 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984729 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f61a3200-3989-4fa9-9836-2a809a33573e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.984857 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.984835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.985196 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.985136 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.985196 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.985172 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/f61a3200-3989-4fa9-9836-2a809a33573e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.985196 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.985238 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.986766 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.986740 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f61a3200-3989-4fa9-9836-2a809a33573e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.986950 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.986931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f61a3200-3989-4fa9-9836-2a809a33573e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.991404 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.991389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f61a3200-3989-4fa9-9836-2a809a33573e-istio-token\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:42.991839 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:42.991819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9nf9\" (UniqueName: \"kubernetes.io/projected/f61a3200-3989-4fa9-9836-2a809a33573e-kube-api-access-x9nf9\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fxphz7\" (UID: \"f61a3200-3989-4fa9-9836-2a809a33573e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:43.154747 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:43.154664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:43.286948 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:43.286919 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7"] Apr 20 15:00:43.288177 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:00:43.288152 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf61a3200_3989_4fa9_9836_2a809a33573e.slice/crio-7267379a658a1b543155ddd9acea684d67f5ee24a835ac3fb6989fa9e40e3c5d WatchSource:0}: Error finding container 7267379a658a1b543155ddd9acea684d67f5ee24a835ac3fb6989fa9e40e3c5d: Status 404 returned error can't find the container with id 7267379a658a1b543155ddd9acea684d67f5ee24a835ac3fb6989fa9e40e3c5d Apr 20 15:00:44.251836 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:44.251793 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" 
event={"ID":"f61a3200-3989-4fa9-9836-2a809a33573e","Type":"ContainerStarted","Data":"7267379a658a1b543155ddd9acea684d67f5ee24a835ac3fb6989fa9e40e3c5d"} Apr 20 15:00:45.870627 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:45.870592 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 15:00:45.870904 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:45.870669 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 15:00:45.870904 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:45.870696 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 20 15:00:46.263786 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:46.263701 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" event={"ID":"f61a3200-3989-4fa9-9836-2a809a33573e","Type":"ContainerStarted","Data":"e2bd76cfda6e61907bbd1f124dada23e7abee5039b5a61d9551789f4a707306b"} Apr 20 15:00:46.283871 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:46.283820 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" podStartSLOduration=1.7037040970000001 podStartE2EDuration="4.283805241s" podCreationTimestamp="2026-04-20 15:00:42 +0000 UTC" firstStartedPulling="2026-04-20 15:00:43.290244997 +0000 UTC m=+414.982788486" lastFinishedPulling="2026-04-20 15:00:45.870346151 +0000 UTC m=+417.562889630" observedRunningTime="2026-04-20 15:00:46.281610656 +0000 UTC m=+417.974154154" watchObservedRunningTime="2026-04-20 15:00:46.283805241 +0000 UTC 
m=+417.976348742" Apr 20 15:00:47.155719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:47.155680 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:47.160503 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:47.160479 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:47.267355 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:47.267316 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:00:47.268261 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:00:47.268245 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fxphz7" Apr 20 15:01:09.381370 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.381283 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ttg4v"] Apr 20 15:01:09.383685 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.383663 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" Apr 20 15:01:09.385928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.385908 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:01:09.386004 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.385908 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-qwc9r\"" Apr 20 15:01:09.386785 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.386767 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:01:09.390529 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.390499 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ttg4v"] Apr 20 15:01:09.511773 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.511743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2jmg\" (UniqueName: \"kubernetes.io/projected/94a9648b-0541-4c6e-aa81-ae890d629b25-kube-api-access-q2jmg\") pod \"kuadrant-operator-catalog-ttg4v\" (UID: \"94a9648b-0541-4c6e-aa81-ae890d629b25\") " pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" Apr 20 15:01:09.613110 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.613076 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2jmg\" (UniqueName: \"kubernetes.io/projected/94a9648b-0541-4c6e-aa81-ae890d629b25-kube-api-access-q2jmg\") pod \"kuadrant-operator-catalog-ttg4v\" (UID: \"94a9648b-0541-4c6e-aa81-ae890d629b25\") " pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" Apr 20 15:01:09.621774 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.621743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2jmg\" (UniqueName: 
\"kubernetes.io/projected/94a9648b-0541-4c6e-aa81-ae890d629b25-kube-api-access-q2jmg\") pod \"kuadrant-operator-catalog-ttg4v\" (UID: \"94a9648b-0541-4c6e-aa81-ae890d629b25\") " pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" Apr 20 15:01:09.694458 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.694394 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" Apr 20 15:01:09.758181 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.758143 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ttg4v"] Apr 20 15:01:09.852255 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.852231 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ttg4v"] Apr 20 15:01:09.854389 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:09.854356 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94a9648b_0541_4c6e_aa81_ae890d629b25.slice/crio-78c06d36c46f0a9737bcf05ccc4840f0695ccb551798d25d27fb1de3b5a542b6 WatchSource:0}: Error finding container 78c06d36c46f0a9737bcf05ccc4840f0695ccb551798d25d27fb1de3b5a542b6: Status 404 returned error can't find the container with id 78c06d36c46f0a9737bcf05ccc4840f0695ccb551798d25d27fb1de3b5a542b6 Apr 20 15:01:09.960343 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.960245 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2ktsp"] Apr 20 15:01:09.963201 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.963184 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:09.970735 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:09.970558 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2ktsp"] Apr 20 15:01:10.118484 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:10.118450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdfhx\" (UniqueName: \"kubernetes.io/projected/6ac24cef-df75-4a36-874e-f7a8348dc133-kube-api-access-pdfhx\") pod \"kuadrant-operator-catalog-2ktsp\" (UID: \"6ac24cef-df75-4a36-874e-f7a8348dc133\") " pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:10.219652 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:10.219573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdfhx\" (UniqueName: \"kubernetes.io/projected/6ac24cef-df75-4a36-874e-f7a8348dc133-kube-api-access-pdfhx\") pod \"kuadrant-operator-catalog-2ktsp\" (UID: \"6ac24cef-df75-4a36-874e-f7a8348dc133\") " pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:10.227312 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:10.227291 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdfhx\" (UniqueName: \"kubernetes.io/projected/6ac24cef-df75-4a36-874e-f7a8348dc133-kube-api-access-pdfhx\") pod \"kuadrant-operator-catalog-2ktsp\" (UID: \"6ac24cef-df75-4a36-874e-f7a8348dc133\") " pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:10.273549 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:10.273512 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:10.348362 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:10.348328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" event={"ID":"94a9648b-0541-4c6e-aa81-ae890d629b25","Type":"ContainerStarted","Data":"78c06d36c46f0a9737bcf05ccc4840f0695ccb551798d25d27fb1de3b5a542b6"} Apr 20 15:01:10.400403 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:10.400373 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-2ktsp"] Apr 20 15:01:10.401175 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:10.401149 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ac24cef_df75_4a36_874e_f7a8348dc133.slice/crio-3638023b98b1a97c8fc03dc07786ddfb3e7c66a463b327f2ee530a533c711638 WatchSource:0}: Error finding container 3638023b98b1a97c8fc03dc07786ddfb3e7c66a463b327f2ee530a533c711638: Status 404 returned error can't find the container with id 3638023b98b1a97c8fc03dc07786ddfb3e7c66a463b327f2ee530a533c711638 Apr 20 15:01:11.354578 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:11.354540 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" event={"ID":"6ac24cef-df75-4a36-874e-f7a8348dc133","Type":"ContainerStarted","Data":"3638023b98b1a97c8fc03dc07786ddfb3e7c66a463b327f2ee530a533c711638"} Apr 20 15:01:12.359138 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.359055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" event={"ID":"94a9648b-0541-4c6e-aa81-ae890d629b25","Type":"ContainerStarted","Data":"18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38"} Apr 20 15:01:12.359555 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.359160 2575 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" podUID="94a9648b-0541-4c6e-aa81-ae890d629b25" containerName="registry-server" containerID="cri-o://18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38" gracePeriod=2 Apr 20 15:01:12.360606 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.360579 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" event={"ID":"6ac24cef-df75-4a36-874e-f7a8348dc133","Type":"ContainerStarted","Data":"6609a7ff331e64220dcbf8743b517925d2526d7c0887323a8ce26f89c5332fbf"} Apr 20 15:01:12.376707 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.376670 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" podStartSLOduration=1.117235057 podStartE2EDuration="3.376659827s" podCreationTimestamp="2026-04-20 15:01:09 +0000 UTC" firstStartedPulling="2026-04-20 15:01:09.855739915 +0000 UTC m=+441.548283392" lastFinishedPulling="2026-04-20 15:01:12.115164683 +0000 UTC m=+443.807708162" observedRunningTime="2026-04-20 15:01:12.374011211 +0000 UTC m=+444.066554707" watchObservedRunningTime="2026-04-20 15:01:12.376659827 +0000 UTC m=+444.069203317" Apr 20 15:01:12.387963 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.387929 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" podStartSLOduration=1.672169508 podStartE2EDuration="3.387917742s" podCreationTimestamp="2026-04-20 15:01:09 +0000 UTC" firstStartedPulling="2026-04-20 15:01:10.402513153 +0000 UTC m=+442.095056633" lastFinishedPulling="2026-04-20 15:01:12.118261381 +0000 UTC m=+443.810804867" observedRunningTime="2026-04-20 15:01:12.387220145 +0000 UTC m=+444.079763644" watchObservedRunningTime="2026-04-20 15:01:12.387917742 +0000 UTC m=+444.080461241" Apr 20 15:01:12.592908 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.592888 2575 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" Apr 20 15:01:12.746180 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.746093 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2jmg\" (UniqueName: \"kubernetes.io/projected/94a9648b-0541-4c6e-aa81-ae890d629b25-kube-api-access-q2jmg\") pod \"94a9648b-0541-4c6e-aa81-ae890d629b25\" (UID: \"94a9648b-0541-4c6e-aa81-ae890d629b25\") " Apr 20 15:01:12.748409 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.748386 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a9648b-0541-4c6e-aa81-ae890d629b25-kube-api-access-q2jmg" (OuterVolumeSpecName: "kube-api-access-q2jmg") pod "94a9648b-0541-4c6e-aa81-ae890d629b25" (UID: "94a9648b-0541-4c6e-aa81-ae890d629b25"). InnerVolumeSpecName "kube-api-access-q2jmg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:12.846909 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:12.846881 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q2jmg\" (UniqueName: \"kubernetes.io/projected/94a9648b-0541-4c6e-aa81-ae890d629b25-kube-api-access-q2jmg\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:13.365549 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.365517 2575 generic.go:358] "Generic (PLEG): container finished" podID="94a9648b-0541-4c6e-aa81-ae890d629b25" containerID="18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38" exitCode=0 Apr 20 15:01:13.365949 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.365594 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" Apr 20 15:01:13.365949 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.365601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" event={"ID":"94a9648b-0541-4c6e-aa81-ae890d629b25","Type":"ContainerDied","Data":"18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38"} Apr 20 15:01:13.365949 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.365631 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-ttg4v" event={"ID":"94a9648b-0541-4c6e-aa81-ae890d629b25","Type":"ContainerDied","Data":"78c06d36c46f0a9737bcf05ccc4840f0695ccb551798d25d27fb1de3b5a542b6"} Apr 20 15:01:13.365949 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.365645 2575 scope.go:117] "RemoveContainer" containerID="18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38" Apr 20 15:01:13.374345 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.374329 2575 scope.go:117] "RemoveContainer" containerID="18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38" Apr 20 15:01:13.374592 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:01:13.374574 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38\": container with ID starting with 18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38 not found: ID does not exist" containerID="18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38" Apr 20 15:01:13.374639 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.374600 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38"} err="failed to get container status \"18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38\": rpc error: 
code = NotFound desc = could not find container \"18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38\": container with ID starting with 18519c34b041ea87e4f1ac2194b43afbaf4f5666e6d1cfc4b925bf9820a1da38 not found: ID does not exist" Apr 20 15:01:13.380526 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.380491 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ttg4v"] Apr 20 15:01:13.383191 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:13.383172 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-ttg4v"] Apr 20 15:01:14.868479 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:14.868449 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a9648b-0541-4c6e-aa81-ae890d629b25" path="/var/lib/kubelet/pods/94a9648b-0541-4c6e-aa81-ae890d629b25/volumes" Apr 20 15:01:20.273995 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:20.273953 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:20.274399 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:20.274007 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:20.295611 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:20.295588 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:20.412202 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:20.412174 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-2ktsp" Apr 20 15:01:24.792453 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.792419 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc"] Apr 20 15:01:24.792816 ip-10-0-140-93 
kubenswrapper[2575]: I0420 15:01:24.792755 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94a9648b-0541-4c6e-aa81-ae890d629b25" containerName="registry-server" Apr 20 15:01:24.792816 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.792768 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a9648b-0541-4c6e-aa81-ae890d629b25" containerName="registry-server" Apr 20 15:01:24.792905 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.792830 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="94a9648b-0541-4c6e-aa81-ae890d629b25" containerName="registry-server" Apr 20 15:01:24.798932 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.798895 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.801481 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.801456 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2pfqw\"" Apr 20 15:01:24.802053 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.802032 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc"] Apr 20 15:01:24.843962 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.843932 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.844113 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.843996 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdtlw\" 
(UniqueName: \"kubernetes.io/projected/f775f5ff-4d61-4291-9a07-f05295333617-kube-api-access-mdtlw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.844113 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.844079 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.944916 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.944889 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.945099 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.944942 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdtlw\" (UniqueName: \"kubernetes.io/projected/f775f5ff-4d61-4291-9a07-f05295333617-kube-api-access-mdtlw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.945099 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.944977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.945335 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.945312 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.945335 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.945329 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:24.957062 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:24.957033 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdtlw\" (UniqueName: \"kubernetes.io/projected/f775f5ff-4d61-4291-9a07-f05295333617-kube-api-access-mdtlw\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:25.109520 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.109446 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:25.227192 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.227166 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc"] Apr 20 15:01:25.228267 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:25.228245 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf775f5ff_4d61_4291_9a07_f05295333617.slice/crio-3c3cad660651ec4c7240e65b389ef9b0240af1d0361f8d58cd3bea174ce5c70a WatchSource:0}: Error finding container 3c3cad660651ec4c7240e65b389ef9b0240af1d0361f8d58cd3bea174ce5c70a: Status 404 returned error can't find the container with id 3c3cad660651ec4c7240e65b389ef9b0240af1d0361f8d58cd3bea174ce5c70a Apr 20 15:01:25.409958 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.408624 2575 generic.go:358] "Generic (PLEG): container finished" podID="f775f5ff-4d61-4291-9a07-f05295333617" containerID="97e3e186d007f18f7d38c4438c1b61f9dc75bcad7dc6404e6474353bcb0c2424" exitCode=0 Apr 20 15:01:25.409958 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.408772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" event={"ID":"f775f5ff-4d61-4291-9a07-f05295333617","Type":"ContainerDied","Data":"97e3e186d007f18f7d38c4438c1b61f9dc75bcad7dc6404e6474353bcb0c2424"} Apr 20 15:01:25.409958 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.408801 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" event={"ID":"f775f5ff-4d61-4291-9a07-f05295333617","Type":"ContainerStarted","Data":"3c3cad660651ec4c7240e65b389ef9b0240af1d0361f8d58cd3bea174ce5c70a"} Apr 20 15:01:25.594444 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.594412 2575 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x"] Apr 20 15:01:25.596872 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.596856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.604693 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.604674 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x"] Apr 20 15:01:25.650806 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.650771 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.650962 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.650811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.650962 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.650878 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n8kd\" (UniqueName: \"kubernetes.io/projected/35bbc67c-5dc1-440c-a1af-8f36a06c132d-kube-api-access-6n8kd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.751453 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.751362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.751453 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.751413 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.751646 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.751462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n8kd\" (UniqueName: \"kubernetes.io/projected/35bbc67c-5dc1-440c-a1af-8f36a06c132d-kube-api-access-6n8kd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.751763 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.751743 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.751815 
ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.751772 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.759158 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.759129 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n8kd\" (UniqueName: \"kubernetes.io/projected/35bbc67c-5dc1-440c-a1af-8f36a06c132d-kube-api-access-6n8kd\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:25.907185 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:25.907156 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:26.029880 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.029854 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x"] Apr 20 15:01:26.031152 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:26.031125 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35bbc67c_5dc1_440c_a1af_8f36a06c132d.slice/crio-7a29f2f6a64610fd517f87d17610d2e328bab08f094fe7ca28ed25cc950d91f5 WatchSource:0}: Error finding container 7a29f2f6a64610fd517f87d17610d2e328bab08f094fe7ca28ed25cc950d91f5: Status 404 returned error can't find the container with id 7a29f2f6a64610fd517f87d17610d2e328bab08f094fe7ca28ed25cc950d91f5 Apr 20 15:01:26.194971 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.194934 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k"] Apr 20 15:01:26.197408 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.197387 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.204609 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.204579 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k"] Apr 20 15:01:26.255351 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.255324 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.255505 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.255391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpvws\" (UniqueName: \"kubernetes.io/projected/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-kube-api-access-qpvws\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.255505 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.255411 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.356611 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.356539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.356750 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.356625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qpvws\" (UniqueName: \"kubernetes.io/projected/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-kube-api-access-qpvws\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.356750 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.356659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.356989 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.356970 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.357080 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.357004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.364122 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.364089 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpvws\" (UniqueName: \"kubernetes.io/projected/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-kube-api-access-qpvws\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.414254 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.414115 2575 generic.go:358] "Generic (PLEG): container finished" podID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerID="c82632e1e617cc618f44f6830b53d4e80c6819414711d34dd9d32b69e7ea3751" exitCode=0 Apr 20 15:01:26.414254 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.414197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" event={"ID":"35bbc67c-5dc1-440c-a1af-8f36a06c132d","Type":"ContainerDied","Data":"c82632e1e617cc618f44f6830b53d4e80c6819414711d34dd9d32b69e7ea3751"} Apr 20 15:01:26.414254 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.414227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" event={"ID":"35bbc67c-5dc1-440c-a1af-8f36a06c132d","Type":"ContainerStarted","Data":"7a29f2f6a64610fd517f87d17610d2e328bab08f094fe7ca28ed25cc950d91f5"} Apr 20 15:01:26.506976 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.506956 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:26.595148 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.595108 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67"] Apr 20 15:01:26.600765 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.600739 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.604964 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.604934 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67"] Apr 20 15:01:26.632498 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.632475 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k"] Apr 20 15:01:26.634632 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:26.634607 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf8d3902_d7ed_48e5_b3ea_1cb78d370f74.slice/crio-72d174ff9aa2d646b419fdd5c2bc04c2d9b0121f31fa9b3af11cf7b0f069f73e WatchSource:0}: Error finding container 72d174ff9aa2d646b419fdd5c2bc04c2d9b0121f31fa9b3af11cf7b0f069f73e: Status 404 returned error can't find the container with id 72d174ff9aa2d646b419fdd5c2bc04c2d9b0121f31fa9b3af11cf7b0f069f73e Apr 20 15:01:26.659048 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.659001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6htwg\" (UniqueName: \"kubernetes.io/projected/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-kube-api-access-6htwg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: 
\"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.659136 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.659089 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.659172 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.659154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.759554 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.759526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6htwg\" (UniqueName: \"kubernetes.io/projected/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-kube-api-access-6htwg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.759654 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.759559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.759654 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.759614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.759983 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.759965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.760060 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.759990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.767723 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:26.767703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6htwg\" (UniqueName: \"kubernetes.io/projected/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-kube-api-access-6htwg\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:26.914221 ip-10-0-140-93 
kubenswrapper[2575]: I0420 15:01:26.914139 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:27.043088 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.043035 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67"] Apr 20 15:01:27.056286 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:27.056250 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c2f3f9c_039c_4593_bb74_577b0e5c7a18.slice/crio-e326f4f47815130d0af4c7897e081a2091d5af1c180f3ec8ab7442ed49b2525a WatchSource:0}: Error finding container e326f4f47815130d0af4c7897e081a2091d5af1c180f3ec8ab7442ed49b2525a: Status 404 returned error can't find the container with id e326f4f47815130d0af4c7897e081a2091d5af1c180f3ec8ab7442ed49b2525a Apr 20 15:01:27.419605 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.419525 2575 generic.go:358] "Generic (PLEG): container finished" podID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerID="c46d6e68bb1527ea7b663fd8677bef02af818cf47b00131ada1afd4c0de6bc23" exitCode=0 Apr 20 15:01:27.419758 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.419619 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" event={"ID":"6c2f3f9c-039c-4593-bb74-577b0e5c7a18","Type":"ContainerDied","Data":"c46d6e68bb1527ea7b663fd8677bef02af818cf47b00131ada1afd4c0de6bc23"} Apr 20 15:01:27.419758 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.419659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" 
event={"ID":"6c2f3f9c-039c-4593-bb74-577b0e5c7a18","Type":"ContainerStarted","Data":"e326f4f47815130d0af4c7897e081a2091d5af1c180f3ec8ab7442ed49b2525a"} Apr 20 15:01:27.421047 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.421012 2575 generic.go:358] "Generic (PLEG): container finished" podID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerID="729ace1e320d38a9698c5b276efce0306c6392d86437e1aa681705ca718f2dc1" exitCode=0 Apr 20 15:01:27.421168 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.421094 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" event={"ID":"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74","Type":"ContainerDied","Data":"729ace1e320d38a9698c5b276efce0306c6392d86437e1aa681705ca718f2dc1"} Apr 20 15:01:27.421168 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.421123 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" event={"ID":"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74","Type":"ContainerStarted","Data":"72d174ff9aa2d646b419fdd5c2bc04c2d9b0121f31fa9b3af11cf7b0f069f73e"} Apr 20 15:01:27.422882 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.422833 2575 generic.go:358] "Generic (PLEG): container finished" podID="f775f5ff-4d61-4291-9a07-f05295333617" containerID="e8040be18c78ffe0baf19b474aafb882879410afa25e01f0f40f3acf4ecee752" exitCode=0 Apr 20 15:01:27.423080 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.422918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" event={"ID":"f775f5ff-4d61-4291-9a07-f05295333617","Type":"ContainerDied","Data":"e8040be18c78ffe0baf19b474aafb882879410afa25e01f0f40f3acf4ecee752"} Apr 20 15:01:27.424719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.424684 2575 generic.go:358] "Generic (PLEG): container finished" podID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" 
containerID="5092b60798eb22255e07a7e9190b914021c707a68609da4b8ff1da4171c62400" exitCode=0 Apr 20 15:01:27.424719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:27.424714 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" event={"ID":"35bbc67c-5dc1-440c-a1af-8f36a06c132d","Type":"ContainerDied","Data":"5092b60798eb22255e07a7e9190b914021c707a68609da4b8ff1da4171c62400"} Apr 20 15:01:28.431492 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:28.431455 2575 generic.go:358] "Generic (PLEG): container finished" podID="f775f5ff-4d61-4291-9a07-f05295333617" containerID="ca355617df408893209bc5afe2b9475fad5d8d05276e13102eba79cd4ff61244" exitCode=0 Apr 20 15:01:28.431866 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:28.431536 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" event={"ID":"f775f5ff-4d61-4291-9a07-f05295333617","Type":"ContainerDied","Data":"ca355617df408893209bc5afe2b9475fad5d8d05276e13102eba79cd4ff61244"} Apr 20 15:01:28.433440 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:28.433415 2575 generic.go:358] "Generic (PLEG): container finished" podID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerID="bc75ee57055dd7a07a8048a4a980b856ea44eec410b6da15dbaaa5ec1189b16b" exitCode=0 Apr 20 15:01:28.433538 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:28.433484 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" event={"ID":"35bbc67c-5dc1-440c-a1af-8f36a06c132d","Type":"ContainerDied","Data":"bc75ee57055dd7a07a8048a4a980b856ea44eec410b6da15dbaaa5ec1189b16b"} Apr 20 15:01:28.435149 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:28.435117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" 
event={"ID":"6c2f3f9c-039c-4593-bb74-577b0e5c7a18","Type":"ContainerStarted","Data":"2e3fa2002be2b73235771ef96a33bf274f95195d8bce6553629eaf772191c668"} Apr 20 15:01:28.436846 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:28.436819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" event={"ID":"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74","Type":"ContainerStarted","Data":"011cc9624a8b1ea075a6c9312933417a6f0c76535cdacce7ee2a4ceb766102ee"} Apr 20 15:01:29.443059 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.443004 2575 generic.go:358] "Generic (PLEG): container finished" podID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerID="2e3fa2002be2b73235771ef96a33bf274f95195d8bce6553629eaf772191c668" exitCode=0 Apr 20 15:01:29.443059 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.443050 2575 generic.go:358] "Generic (PLEG): container finished" podID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerID="95bade87cdd68b4cd3818e4552ef0937c77f826f91d2535e718f6efc9a1db8a0" exitCode=0 Apr 20 15:01:29.443555 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.443091 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" event={"ID":"6c2f3f9c-039c-4593-bb74-577b0e5c7a18","Type":"ContainerDied","Data":"2e3fa2002be2b73235771ef96a33bf274f95195d8bce6553629eaf772191c668"} Apr 20 15:01:29.443555 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.443132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" event={"ID":"6c2f3f9c-039c-4593-bb74-577b0e5c7a18","Type":"ContainerDied","Data":"95bade87cdd68b4cd3818e4552ef0937c77f826f91d2535e718f6efc9a1db8a0"} Apr 20 15:01:29.444964 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.444943 2575 generic.go:358] "Generic (PLEG): container finished" podID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" 
containerID="011cc9624a8b1ea075a6c9312933417a6f0c76535cdacce7ee2a4ceb766102ee" exitCode=0 Apr 20 15:01:29.444964 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.444963 2575 generic.go:358] "Generic (PLEG): container finished" podID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerID="ad3013ea8107d94c54fd745e5597c98273e178fb74dcc90bd1bde609d652ae14" exitCode=0 Apr 20 15:01:29.445128 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.444972 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" event={"ID":"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74","Type":"ContainerDied","Data":"011cc9624a8b1ea075a6c9312933417a6f0c76535cdacce7ee2a4ceb766102ee"} Apr 20 15:01:29.445128 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.445010 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" event={"ID":"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74","Type":"ContainerDied","Data":"ad3013ea8107d94c54fd745e5597c98273e178fb74dcc90bd1bde609d652ae14"} Apr 20 15:01:29.581161 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.581015 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:29.607743 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.607717 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:29.687471 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.687443 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-util\") pod \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " Apr 20 15:01:29.687648 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.687483 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-bundle\") pod \"f775f5ff-4d61-4291-9a07-f05295333617\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " Apr 20 15:01:29.687648 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.687521 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-util\") pod \"f775f5ff-4d61-4291-9a07-f05295333617\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " Apr 20 15:01:29.687648 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.687557 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-bundle\") pod \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " Apr 20 15:01:29.687648 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.687610 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdtlw\" (UniqueName: \"kubernetes.io/projected/f775f5ff-4d61-4291-9a07-f05295333617-kube-api-access-mdtlw\") pod \"f775f5ff-4d61-4291-9a07-f05295333617\" (UID: \"f775f5ff-4d61-4291-9a07-f05295333617\") " Apr 20 15:01:29.687648 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.687642 2575 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n8kd\" (UniqueName: \"kubernetes.io/projected/35bbc67c-5dc1-440c-a1af-8f36a06c132d-kube-api-access-6n8kd\") pod \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\" (UID: \"35bbc67c-5dc1-440c-a1af-8f36a06c132d\") " Apr 20 15:01:29.688076 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.688046 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-bundle" (OuterVolumeSpecName: "bundle") pod "f775f5ff-4d61-4291-9a07-f05295333617" (UID: "f775f5ff-4d61-4291-9a07-f05295333617"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:29.688179 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.688136 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-bundle" (OuterVolumeSpecName: "bundle") pod "35bbc67c-5dc1-440c-a1af-8f36a06c132d" (UID: "35bbc67c-5dc1-440c-a1af-8f36a06c132d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:29.689910 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.689882 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f775f5ff-4d61-4291-9a07-f05295333617-kube-api-access-mdtlw" (OuterVolumeSpecName: "kube-api-access-mdtlw") pod "f775f5ff-4d61-4291-9a07-f05295333617" (UID: "f775f5ff-4d61-4291-9a07-f05295333617"). InnerVolumeSpecName "kube-api-access-mdtlw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:29.689983 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.689959 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35bbc67c-5dc1-440c-a1af-8f36a06c132d-kube-api-access-6n8kd" (OuterVolumeSpecName: "kube-api-access-6n8kd") pod "35bbc67c-5dc1-440c-a1af-8f36a06c132d" (UID: "35bbc67c-5dc1-440c-a1af-8f36a06c132d"). InnerVolumeSpecName "kube-api-access-6n8kd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:29.692716 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.692692 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-util" (OuterVolumeSpecName: "util") pod "35bbc67c-5dc1-440c-a1af-8f36a06c132d" (UID: "35bbc67c-5dc1-440c-a1af-8f36a06c132d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:29.693452 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.693410 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-util" (OuterVolumeSpecName: "util") pod "f775f5ff-4d61-4291-9a07-f05295333617" (UID: "f775f5ff-4d61-4291-9a07-f05295333617"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:29.788732 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.788710 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:29.788732 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.788732 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mdtlw\" (UniqueName: \"kubernetes.io/projected/f775f5ff-4d61-4291-9a07-f05295333617-kube-api-access-mdtlw\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:29.788873 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.788742 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6n8kd\" (UniqueName: \"kubernetes.io/projected/35bbc67c-5dc1-440c-a1af-8f36a06c132d-kube-api-access-6n8kd\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:29.788873 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.788752 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/35bbc67c-5dc1-440c-a1af-8f36a06c132d-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:29.788873 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.788761 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:29.788873 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:29.788769 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f775f5ff-4d61-4291-9a07-f05295333617-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:30.449853 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.449827 2575 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" Apr 20 15:01:30.450265 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.449817 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc" event={"ID":"f775f5ff-4d61-4291-9a07-f05295333617","Type":"ContainerDied","Data":"3c3cad660651ec4c7240e65b389ef9b0240af1d0361f8d58cd3bea174ce5c70a"} Apr 20 15:01:30.450265 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.449935 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c3cad660651ec4c7240e65b389ef9b0240af1d0361f8d58cd3bea174ce5c70a" Apr 20 15:01:30.451508 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.451487 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" Apr 20 15:01:30.451615 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.451486 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x" event={"ID":"35bbc67c-5dc1-440c-a1af-8f36a06c132d","Type":"ContainerDied","Data":"7a29f2f6a64610fd517f87d17610d2e328bab08f094fe7ca28ed25cc950d91f5"} Apr 20 15:01:30.451615 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.451589 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a29f2f6a64610fd517f87d17610d2e328bab08f094fe7ca28ed25cc950d91f5" Apr 20 15:01:30.602894 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.602867 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" Apr 20 15:01:30.606082 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.606056 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" Apr 20 15:01:30.696764 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.696732 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6htwg\" (UniqueName: \"kubernetes.io/projected/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-kube-api-access-6htwg\") pod \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " Apr 20 15:01:30.696950 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.696817 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-bundle\") pod \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " Apr 20 15:01:30.696950 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.696846 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-bundle\") pod \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " Apr 20 15:01:30.696950 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.696864 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-util\") pod \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " Apr 20 15:01:30.696950 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.696887 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpvws\" (UniqueName: \"kubernetes.io/projected/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-kube-api-access-qpvws\") pod \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\" (UID: \"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74\") " Apr 20 15:01:30.696950 ip-10-0-140-93 
kubenswrapper[2575]: I0420 15:01:30.696919 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-util\") pod \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\" (UID: \"6c2f3f9c-039c-4593-bb74-577b0e5c7a18\") " Apr 20 15:01:30.697541 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.697496 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-bundle" (OuterVolumeSpecName: "bundle") pod "cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" (UID: "cf8d3902-d7ed-48e5-b3ea-1cb78d370f74"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:30.697667 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.697638 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-bundle" (OuterVolumeSpecName: "bundle") pod "6c2f3f9c-039c-4593-bb74-577b0e5c7a18" (UID: "6c2f3f9c-039c-4593-bb74-577b0e5c7a18"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:30.699323 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.699297 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-kube-api-access-6htwg" (OuterVolumeSpecName: "kube-api-access-6htwg") pod "6c2f3f9c-039c-4593-bb74-577b0e5c7a18" (UID: "6c2f3f9c-039c-4593-bb74-577b0e5c7a18"). InnerVolumeSpecName "kube-api-access-6htwg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:30.699607 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.699581 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-kube-api-access-qpvws" (OuterVolumeSpecName: "kube-api-access-qpvws") pod "cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" (UID: "cf8d3902-d7ed-48e5-b3ea-1cb78d370f74"). InnerVolumeSpecName "kube-api-access-qpvws". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:30.702758 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.702706 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-util" (OuterVolumeSpecName: "util") pod "6c2f3f9c-039c-4593-bb74-577b0e5c7a18" (UID: "6c2f3f9c-039c-4593-bb74-577b0e5c7a18"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:30.703061 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.703040 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-util" (OuterVolumeSpecName: "util") pod "cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" (UID: "cf8d3902-d7ed-48e5-b3ea-1cb78d370f74"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:30.797878 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.797837 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:30.797878 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.797873 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6htwg\" (UniqueName: \"kubernetes.io/projected/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-kube-api-access-6htwg\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:30.798117 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.797888 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:30.798117 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.797900 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c2f3f9c-039c-4593-bb74-577b0e5c7a18-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:30.798117 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.797911 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-util\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:30.798117 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:30.797922 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qpvws\" (UniqueName: \"kubernetes.io/projected/cf8d3902-d7ed-48e5-b3ea-1cb78d370f74-kube-api-access-qpvws\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:31.456770 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:31.456727 2575 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67" event={"ID":"6c2f3f9c-039c-4593-bb74-577b0e5c7a18","Type":"ContainerDied","Data":"e326f4f47815130d0af4c7897e081a2091d5af1c180f3ec8ab7442ed49b2525a"}
Apr 20 15:01:31.456770 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:31.456770 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e326f4f47815130d0af4c7897e081a2091d5af1c180f3ec8ab7442ed49b2525a"
Apr 20 15:01:31.457324 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:31.456792 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67"
Apr 20 15:01:31.458520 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:31.458504 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k"
Apr 20 15:01:31.458623 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:31.458534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k" event={"ID":"cf8d3902-d7ed-48e5-b3ea-1cb78d370f74","Type":"ContainerDied","Data":"72d174ff9aa2d646b419fdd5c2bc04c2d9b0121f31fa9b3af11cf7b0f069f73e"}
Apr 20 15:01:31.458623 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:31.458558 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72d174ff9aa2d646b419fdd5c2bc04c2d9b0121f31fa9b3af11cf7b0f069f73e"
Apr 20 15:01:37.195623 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195584 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"]
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195936 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f775f5ff-4d61-4291-9a07-f05295333617" containerName="extract"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195950 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f775f5ff-4d61-4291-9a07-f05295333617" containerName="extract"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195965 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerName="util"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195971 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerName="util"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195977 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerName="extract"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195982 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerName="extract"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195990 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f775f5ff-4d61-4291-9a07-f05295333617" containerName="pull"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.195994 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f775f5ff-4d61-4291-9a07-f05295333617" containerName="pull"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196001 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerName="util"
Apr 20 15:01:37.196014 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196014 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerName="util"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196052 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerName="pull"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196057 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerName="pull"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196065 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f775f5ff-4d61-4291-9a07-f05295333617" containerName="util"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196069 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="f775f5ff-4d61-4291-9a07-f05295333617" containerName="util"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196074 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerName="pull"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196078 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerName="pull"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196083 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerName="extract"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196087 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerName="extract"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196093 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerName="util"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196098 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerName="util"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196103 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerName="pull"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196108 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerName="pull"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196119 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerName="extract"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196127 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerName="extract"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196177 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="f775f5ff-4d61-4291-9a07-f05295333617" containerName="extract"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196183 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="6c2f3f9c-039c-4593-bb74-577b0e5c7a18" containerName="extract"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196198 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf8d3902-d7ed-48e5-b3ea-1cb78d370f74" containerName="extract"
Apr 20 15:01:37.196334 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.196208 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="35bbc67c-5dc1-440c-a1af-8f36a06c132d" containerName="extract"
Apr 20 15:01:37.198259 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.198243 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:37.200544 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.200519 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6qqhv\""
Apr 20 15:01:37.210814 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.210791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"]
Apr 20 15:01:37.255743 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.255719 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dshrj\" (UniqueName: \"kubernetes.io/projected/8e4c8b7d-a018-4d08-bd35-863a9cc06581-kube-api-access-dshrj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" (UID: \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:37.255857 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.255759 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e4c8b7d-a018-4d08-bd35-863a9cc06581-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" (UID: \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:37.356752 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.356723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dshrj\" (UniqueName: \"kubernetes.io/projected/8e4c8b7d-a018-4d08-bd35-863a9cc06581-kube-api-access-dshrj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" (UID: \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:37.356895 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.356764 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e4c8b7d-a018-4d08-bd35-863a9cc06581-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" (UID: \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:37.357108 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.357092 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e4c8b7d-a018-4d08-bd35-863a9cc06581-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" (UID: \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:37.383583 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.383556 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dshrj\" (UniqueName: \"kubernetes.io/projected/8e4c8b7d-a018-4d08-bd35-863a9cc06581-kube-api-access-dshrj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" (UID: \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:37.508227 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.508149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:37.639503 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:37.639478 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"]
Apr 20 15:01:37.642230 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:37.642199 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e4c8b7d_a018_4d08_bd35_863a9cc06581.slice/crio-55d8060789af7b3f7cad222e02ca97842007f6c30cb3420f93d0d29f9d0f86bd WatchSource:0}: Error finding container 55d8060789af7b3f7cad222e02ca97842007f6c30cb3420f93d0d29f9d0f86bd: Status 404 returned error can't find the container with id 55d8060789af7b3f7cad222e02ca97842007f6c30cb3420f93d0d29f9d0f86bd
Apr 20 15:01:38.487533 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:38.487499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" event={"ID":"8e4c8b7d-a018-4d08-bd35-863a9cc06581","Type":"ContainerStarted","Data":"55d8060789af7b3f7cad222e02ca97842007f6c30cb3420f93d0d29f9d0f86bd"}
Apr 20 15:01:42.968409 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:42.968365 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"]
Apr 20 15:01:42.976719 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:42.976688 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"
Apr 20 15:01:42.979219 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:42.979162 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"]
Apr 20 15:01:42.979690 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:42.979664 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-4szxt\""
Apr 20 15:01:42.979852 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:42.979819 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 20 15:01:43.110541 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.110518 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m52lv\" (UniqueName: \"kubernetes.io/projected/f9be6cbd-aafd-4893-94fe-67163f1587c3-kube-api-access-m52lv\") pod \"dns-operator-controller-manager-648d5c98bc-2hcj5\" (UID: \"f9be6cbd-aafd-4893-94fe-67163f1587c3\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"
Apr 20 15:01:43.211808 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.211775 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m52lv\" (UniqueName: \"kubernetes.io/projected/f9be6cbd-aafd-4893-94fe-67163f1587c3-kube-api-access-m52lv\") pod \"dns-operator-controller-manager-648d5c98bc-2hcj5\" (UID: \"f9be6cbd-aafd-4893-94fe-67163f1587c3\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"
Apr 20 15:01:43.219905 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.219841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m52lv\" (UniqueName: \"kubernetes.io/projected/f9be6cbd-aafd-4893-94fe-67163f1587c3-kube-api-access-m52lv\") pod \"dns-operator-controller-manager-648d5c98bc-2hcj5\" (UID: \"f9be6cbd-aafd-4893-94fe-67163f1587c3\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"
Apr 20 15:01:43.290815 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.290784 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"
Apr 20 15:01:43.425680 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.425655 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"]
Apr 20 15:01:43.427480 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:43.427451 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9be6cbd_aafd_4893_94fe_67163f1587c3.slice/crio-51e7143c09d68e927f2d65b62d8c1deb8d975fb55d1f9934fc591ef4f63b3de0 WatchSource:0}: Error finding container 51e7143c09d68e927f2d65b62d8c1deb8d975fb55d1f9934fc591ef4f63b3de0: Status 404 returned error can't find the container with id 51e7143c09d68e927f2d65b62d8c1deb8d975fb55d1f9934fc591ef4f63b3de0
Apr 20 15:01:43.508724 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.508632 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" event={"ID":"8e4c8b7d-a018-4d08-bd35-863a9cc06581","Type":"ContainerStarted","Data":"9bb513c468b6ad7a96dbdf5beb35cee27547bb0bd8b461a24f9759ba30342157"}
Apr 20 15:01:43.508872 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.508768 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:43.509740 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.509720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5" event={"ID":"f9be6cbd-aafd-4893-94fe-67163f1587c3","Type":"ContainerStarted","Data":"51e7143c09d68e927f2d65b62d8c1deb8d975fb55d1f9934fc591ef4f63b3de0"}
Apr 20 15:01:43.529116 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:43.529074 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" podStartSLOduration=1.082330748 podStartE2EDuration="6.529061882s" podCreationTimestamp="2026-04-20 15:01:37 +0000 UTC" firstStartedPulling="2026-04-20 15:01:37.64465391 +0000 UTC m=+469.337197390" lastFinishedPulling="2026-04-20 15:01:43.091385046 +0000 UTC m=+474.783928524" observedRunningTime="2026-04-20 15:01:43.527974552 +0000 UTC m=+475.220518050" watchObservedRunningTime="2026-04-20 15:01:43.529061882 +0000 UTC m=+475.221605382"
Apr 20 15:01:46.131907 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.131874 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"]
Apr 20 15:01:46.135813 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.135773 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.138215 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.138190 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 20 15:01:46.138348 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.138215 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 20 15:01:46.138348 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.138190 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-2pfqw\""
Apr 20 15:01:46.144068 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.144045 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"]
Apr 20 15:01:46.238780 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.238741 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/97f7f61e-7044-4647-b569-38e5b28ec72c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.238967 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.238895 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhsf9\" (UniqueName: \"kubernetes.io/projected/97f7f61e-7044-4647-b569-38e5b28ec72c-kube-api-access-jhsf9\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.238967 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.238958 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/97f7f61e-7044-4647-b569-38e5b28ec72c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.339814 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.339779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhsf9\" (UniqueName: \"kubernetes.io/projected/97f7f61e-7044-4647-b569-38e5b28ec72c-kube-api-access-jhsf9\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.340152 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.339824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/97f7f61e-7044-4647-b569-38e5b28ec72c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.340152 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.339881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/97f7f61e-7044-4647-b569-38e5b28ec72c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.347045 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.341168 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/97f7f61e-7044-4647-b569-38e5b28ec72c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.347454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.347428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/97f7f61e-7044-4647-b569-38e5b28ec72c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.353246 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.353224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhsf9\" (UniqueName: \"kubernetes.io/projected/97f7f61e-7044-4647-b569-38e5b28ec72c-kube-api-access-jhsf9\") pod \"kuadrant-console-plugin-6cb54b5c86-qmdj7\" (UID: \"97f7f61e-7044-4647-b569-38e5b28ec72c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.449282 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.449254 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"
Apr 20 15:01:46.524883 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.524851 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5" event={"ID":"f9be6cbd-aafd-4893-94fe-67163f1587c3","Type":"ContainerStarted","Data":"aabdb595fb101e080ce515814ddbb362899c78f7d790debcc36aa68bc9e4328d"}
Apr 20 15:01:46.525037 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.524992 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5"
Apr 20 15:01:46.543927 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.543870 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5" podStartSLOduration=1.620108646 podStartE2EDuration="4.54385108s" podCreationTimestamp="2026-04-20 15:01:42 +0000 UTC" firstStartedPulling="2026-04-20 15:01:43.429357089 +0000 UTC m=+475.121900567" lastFinishedPulling="2026-04-20 15:01:46.353099516 +0000 UTC m=+478.045643001" observedRunningTime="2026-04-20 15:01:46.543266071 +0000 UTC m=+478.235809573" watchObservedRunningTime="2026-04-20 15:01:46.54385108 +0000 UTC m=+478.236394581"
Apr 20 15:01:46.579546 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:46.579509 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7"]
Apr 20 15:01:46.580691 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:46.580662 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97f7f61e_7044_4647_b569_38e5b28ec72c.slice/crio-4f56cfa61ea1ef30c7de032d4724522f362644e5cffe4fa7731b954d3fe7c984 WatchSource:0}: Error finding container 4f56cfa61ea1ef30c7de032d4724522f362644e5cffe4fa7731b954d3fe7c984: Status 404 returned error can't find the container with id 4f56cfa61ea1ef30c7de032d4724522f362644e5cffe4fa7731b954d3fe7c984
Apr 20 15:01:47.529769 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:47.529727 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7" event={"ID":"97f7f61e-7044-4647-b569-38e5b28ec72c","Type":"ContainerStarted","Data":"4f56cfa61ea1ef30c7de032d4724522f362644e5cffe4fa7731b954d3fe7c984"}
Apr 20 15:01:54.516918 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:54.516882 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"
Apr 20 15:01:56.293131 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.293091 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"]
Apr 20 15:01:56.295584 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.295558 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"
Apr 20 15:01:56.312549 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.312524 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"]
Apr 20 15:01:56.372450 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.372322 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"]
Apr 20 15:01:56.372929 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:01:56.372898 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[extensions-socket-volume kube-api-access-zx8pj], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" podUID="27e912c6-5fd4-42fa-8cf3-20011b79def9"
Apr 20 15:01:56.376296 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.376269 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"]
Apr 20 15:01:56.383089 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.383061 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"]
Apr 20 15:01:56.383309 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.383282 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" containerName="manager" containerID="cri-o://9bb513c468b6ad7a96dbdf5beb35cee27547bb0bd8b461a24f9759ba30342157" gracePeriod=2
Apr 20 15:01:56.387957 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.387929 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"]
Apr 20 15:01:56.397270 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.397247 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw"]
Apr 20 15:01:56.397704 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.397389 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"
Apr 20 15:01:56.400617 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.400303 2575 status_manager.go:895] "Failed to get status for pod" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object"
Apr 20 15:01:56.404405 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.404381 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"]
Apr 20 15:01:56.408727 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.408696 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"]
Apr 20 15:01:56.409271 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.409247 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" containerName="manager"
Apr 20 15:01:56.409354 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.409289 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" containerName="manager"
Apr 20 15:01:56.409453 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.409434 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" containerName="manager"
Apr 20 15:01:56.413140 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.413119 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"
Apr 20 15:01:56.423984 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.423960 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"]
Apr 20 15:01:56.425478 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.425443 2575 status_manager.go:895] "Failed to get status for pod" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object"
Apr 20 15:01:56.436887 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.436866 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx8pj\" (UniqueName: \"kubernetes.io/projected/27e912c6-5fd4-42fa-8cf3-20011b79def9-kube-api-access-zx8pj\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" (UID: \"27e912c6-5fd4-42fa-8cf3-20011b79def9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"
Apr 20 15:01:56.437006 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.436920 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/27e912c6-5fd4-42fa-8cf3-20011b79def9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" (UID: \"27e912c6-5fd4-42fa-8cf3-20011b79def9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"
Apr 20 15:01:56.538171 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.538125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zx8pj\" (UniqueName: \"kubernetes.io/projected/27e912c6-5fd4-42fa-8cf3-20011b79def9-kube-api-access-zx8pj\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" (UID: \"27e912c6-5fd4-42fa-8cf3-20011b79def9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"
Apr 20 15:01:56.538171 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.538171 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c29776ad-873d-4da8-9ea6-fd3cde64ac51-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgpsp\" (UID: \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"
Apr 20 15:01:56.538411 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.538220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/27e912c6-5fd4-42fa-8cf3-20011b79def9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" (UID: \"27e912c6-5fd4-42fa-8cf3-20011b79def9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"
Apr 20 15:01:56.538411 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.538247 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5ea4974-b456-45f3-aab4-c493274c53b0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ws9x8\" (UID: \"e5ea4974-b456-45f3-aab4-c493274c53b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"
Apr 20 15:01:56.538411 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.538335 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566tj\" (UniqueName: \"kubernetes.io/projected/e5ea4974-b456-45f3-aab4-c493274c53b0-kube-api-access-566tj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ws9x8\" (UID: \"e5ea4974-b456-45f3-aab4-c493274c53b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"
Apr 20 15:01:56.538575 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.538481 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hs4d\" (UniqueName: \"kubernetes.io/projected/c29776ad-873d-4da8-9ea6-fd3cde64ac51-kube-api-access-6hs4d\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgpsp\" (UID: \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"
Apr 20 15:01:56.538703 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.538680 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/27e912c6-5fd4-42fa-8cf3-20011b79def9-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" (UID: \"27e912c6-5fd4-42fa-8cf3-20011b79def9\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2"
Apr 20 15:01:56.543093 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:01:56.543066 2575 projected.go:194] Error preparing data for projected volume kube-api-access-zx8pj for pod kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2: failed to fetch token: pod "kuadrant-operator-controller-manager-55c7f4c975-vgnh2" not found
Apr 20 15:01:56.543256 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:01:56.543154 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27e912c6-5fd4-42fa-8cf3-20011b79def9-kube-api-access-zx8pj podName:27e912c6-5fd4-42fa-8cf3-20011b79def9 nodeName:}" failed. No retries permitted until 2026-04-20 15:01:57.043131705 +0000 UTC m=+488.735675183 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zx8pj" (UniqueName: "kubernetes.io/projected/27e912c6-5fd4-42fa-8cf3-20011b79def9-kube-api-access-zx8pj") pod "kuadrant-operator-controller-manager-55c7f4c975-vgnh2" (UID: "27e912c6-5fd4-42fa-8cf3-20011b79def9") : failed to fetch token: pod "kuadrant-operator-controller-manager-55c7f4c975-vgnh2" not found
Apr 20 15:01:56.575221 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.575174 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" containerID="9bb513c468b6ad7a96dbdf5beb35cee27547bb0bd8b461a24f9759ba30342157" exitCode=0
Apr 20 15:01:56.575406 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.575336 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" Apr 20 15:01:56.577676 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.577642 2575 status_manager.go:895] "Failed to get status for pod" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object" Apr 20 15:01:56.579461 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.579424 2575 status_manager.go:895] "Failed to get status for pod" podUID="27e912c6-5fd4-42fa-8cf3-20011b79def9" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object" Apr 20 15:01:56.582205 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.582186 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" Apr 20 15:01:56.584226 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.584195 2575 status_manager.go:895] "Failed to get status for pod" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object" Apr 20 15:01:56.586109 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.586079 2575 status_manager.go:895] "Failed to get status for pod" podUID="27e912c6-5fd4-42fa-8cf3-20011b79def9" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object" Apr 20 15:01:56.638958 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.638930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hs4d\" (UniqueName: \"kubernetes.io/projected/c29776ad-873d-4da8-9ea6-fd3cde64ac51-kube-api-access-6hs4d\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgpsp\" (UID: \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" Apr 20 15:01:56.639124 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.638996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c29776ad-873d-4da8-9ea6-fd3cde64ac51-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-55c7f4c975-zgpsp\" (UID: \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" Apr 20 15:01:56.639124 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.639065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5ea4974-b456-45f3-aab4-c493274c53b0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ws9x8\" (UID: \"e5ea4974-b456-45f3-aab4-c493274c53b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" Apr 20 15:01:56.639258 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.639124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-566tj\" (UniqueName: \"kubernetes.io/projected/e5ea4974-b456-45f3-aab4-c493274c53b0-kube-api-access-566tj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ws9x8\" (UID: \"e5ea4974-b456-45f3-aab4-c493274c53b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" Apr 20 15:01:56.639699 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.639670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5ea4974-b456-45f3-aab4-c493274c53b0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ws9x8\" (UID: \"e5ea4974-b456-45f3-aab4-c493274c53b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" Apr 20 15:01:56.639811 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.639691 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c29776ad-873d-4da8-9ea6-fd3cde64ac51-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgpsp\" (UID: 
\"c29776ad-873d-4da8-9ea6-fd3cde64ac51\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" Apr 20 15:01:56.644758 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.644736 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" Apr 20 15:01:56.646779 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.646742 2575 status_manager.go:895] "Failed to get status for pod" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-k52bw\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object" Apr 20 15:01:56.648649 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.648617 2575 status_manager.go:895] "Failed to get status for pod" podUID="27e912c6-5fd4-42fa-8cf3-20011b79def9" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object" Apr 20 15:01:56.648761 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.648719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hs4d\" (UniqueName: \"kubernetes.io/projected/c29776ad-873d-4da8-9ea6-fd3cde64ac51-kube-api-access-6hs4d\") pod \"kuadrant-operator-controller-manager-55c7f4c975-zgpsp\" (UID: \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" Apr 20 15:01:56.648941 ip-10-0-140-93 
kubenswrapper[2575]: I0420 15:01:56.648922 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-566tj\" (UniqueName: \"kubernetes.io/projected/e5ea4974-b456-45f3-aab4-c493274c53b0-kube-api-access-566tj\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-ws9x8\" (UID: \"e5ea4974-b456-45f3-aab4-c493274c53b0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" Apr 20 15:01:56.740516 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.740484 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/27e912c6-5fd4-42fa-8cf3-20011b79def9-extensions-socket-volume\") pod \"27e912c6-5fd4-42fa-8cf3-20011b79def9\" (UID: \"27e912c6-5fd4-42fa-8cf3-20011b79def9\") " Apr 20 15:01:56.740714 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.740558 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e4c8b7d-a018-4d08-bd35-863a9cc06581-extensions-socket-volume\") pod \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\" (UID: \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\") " Apr 20 15:01:56.740714 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.740605 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dshrj\" (UniqueName: \"kubernetes.io/projected/8e4c8b7d-a018-4d08-bd35-863a9cc06581-kube-api-access-dshrj\") pod \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\" (UID: \"8e4c8b7d-a018-4d08-bd35-863a9cc06581\") " Apr 20 15:01:56.740890 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.740862 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27e912c6-5fd4-42fa-8cf3-20011b79def9-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "27e912c6-5fd4-42fa-8cf3-20011b79def9" (UID: "27e912c6-5fd4-42fa-8cf3-20011b79def9"). 
InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:56.740960 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.740929 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zx8pj\" (UniqueName: \"kubernetes.io/projected/27e912c6-5fd4-42fa-8cf3-20011b79def9-kube-api-access-zx8pj\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:56.740960 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.740946 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/27e912c6-5fd4-42fa-8cf3-20011b79def9-extensions-socket-volume\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:56.741055 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.740989 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e4c8b7d-a018-4d08-bd35-863a9cc06581-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "8e4c8b7d-a018-4d08-bd35-863a9cc06581" (UID: "8e4c8b7d-a018-4d08-bd35-863a9cc06581"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:56.743196 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.743169 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e4c8b7d-a018-4d08-bd35-863a9cc06581-kube-api-access-dshrj" (OuterVolumeSpecName: "kube-api-access-dshrj") pod "8e4c8b7d-a018-4d08-bd35-863a9cc06581" (UID: "8e4c8b7d-a018-4d08-bd35-863a9cc06581"). InnerVolumeSpecName "kube-api-access-dshrj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:56.759507 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.759483 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" Apr 20 15:01:56.768216 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.768185 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" Apr 20 15:01:56.842326 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.842294 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dshrj\" (UniqueName: \"kubernetes.io/projected/8e4c8b7d-a018-4d08-bd35-863a9cc06581-kube-api-access-dshrj\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:56.842326 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.842332 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e4c8b7d-a018-4d08-bd35-863a9cc06581-extensions-socket-volume\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:01:56.872270 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.872142 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e912c6-5fd4-42fa-8cf3-20011b79def9" path="/var/lib/kubelet/pods/27e912c6-5fd4-42fa-8cf3-20011b79def9/volumes" Apr 20 15:01:56.872792 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.872551 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e4c8b7d-a018-4d08-bd35-863a9cc06581" path="/var/lib/kubelet/pods/8e4c8b7d-a018-4d08-bd35-863a9cc06581/volumes" Apr 20 15:01:56.945372 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.945342 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"] Apr 20 15:01:56.947263 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:56.947241 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc29776ad_873d_4da8_9ea6_fd3cde64ac51.slice/crio-3c93b8e51cb11910df86b0a1f226188a0ad42f46bcf8f1f89d6f5600e230e9ac WatchSource:0}: Error finding container 3c93b8e51cb11910df86b0a1f226188a0ad42f46bcf8f1f89d6f5600e230e9ac: Status 404 returned error can't find the container with id 3c93b8e51cb11910df86b0a1f226188a0ad42f46bcf8f1f89d6f5600e230e9ac Apr 20 15:01:56.953008 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:56.952986 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"] Apr 20 15:01:56.954688 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:01:56.954665 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ea4974_b456_45f3_aab4_c493274c53b0.slice/crio-48606e8aec1680148217333b2f5ac62757d6162e321b16565fbeaf443fe99330 WatchSource:0}: Error finding container 48606e8aec1680148217333b2f5ac62757d6162e321b16565fbeaf443fe99330: Status 404 returned error can't find the container with id 48606e8aec1680148217333b2f5ac62757d6162e321b16565fbeaf443fe99330 Apr 20 15:01:57.532446 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.532408 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-2hcj5" Apr 20 15:01:57.584804 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.584690 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-k52bw" Apr 20 15:01:57.584804 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.584702 2575 scope.go:117] "RemoveContainer" containerID="9bb513c468b6ad7a96dbdf5beb35cee27547bb0bd8b461a24f9759ba30342157" Apr 20 15:01:57.587769 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.587677 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" event={"ID":"c29776ad-873d-4da8-9ea6-fd3cde64ac51","Type":"ContainerStarted","Data":"465b6ae43d98eba94dd6229f6c0702c3260d9026513467af9131f2d645b796fb"} Apr 20 15:01:57.587769 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.587717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" event={"ID":"c29776ad-873d-4da8-9ea6-fd3cde64ac51","Type":"ContainerStarted","Data":"3c93b8e51cb11910df86b0a1f226188a0ad42f46bcf8f1f89d6f5600e230e9ac"} Apr 20 15:01:57.588391 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.588094 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" Apr 20 15:01:57.590136 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.590071 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" Apr 20 15:01:57.590414 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.590389 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" event={"ID":"e5ea4974-b456-45f3-aab4-c493274c53b0","Type":"ContainerStarted","Data":"d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf"} Apr 20 15:01:57.590493 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.590422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" event={"ID":"e5ea4974-b456-45f3-aab4-c493274c53b0","Type":"ContainerStarted","Data":"48606e8aec1680148217333b2f5ac62757d6162e321b16565fbeaf443fe99330"} Apr 20 15:01:57.590548 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.590514 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" Apr 20 15:01:57.607866 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.607821 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" podStartSLOduration=1.607805147 podStartE2EDuration="1.607805147s" podCreationTimestamp="2026-04-20 15:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:01:57.606628151 +0000 UTC m=+489.299171678" watchObservedRunningTime="2026-04-20 15:01:57.607805147 +0000 UTC m=+489.300348647" Apr 20 15:01:57.608548 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.608523 2575 status_manager.go:895] "Failed to get status for pod" podUID="27e912c6-5fd4-42fa-8cf3-20011b79def9" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" err="pods 
\"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object" Apr 20 15:01:57.629809 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:57.629762 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" podStartSLOduration=1.629748392 podStartE2EDuration="1.629748392s" podCreationTimestamp="2026-04-20 15:01:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:01:57.627345998 +0000 UTC m=+489.319889521" watchObservedRunningTime="2026-04-20 15:01:57.629748392 +0000 UTC m=+489.322291890" Apr 20 15:01:58.869757 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:01:58.869710 2575 status_manager.go:895] "Failed to get status for pod" podUID="27e912c6-5fd4-42fa-8cf3-20011b79def9" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-vgnh2" err="pods \"kuadrant-operator-controller-manager-55c7f4c975-vgnh2\" is forbidden: User \"system:node:ip-10-0-140-93.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-140-93.ec2.internal' and this object" Apr 20 15:02:02.498497 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.498463 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-d74bcc6f7-mfhfl"] Apr 20 15:02:02.654127 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.654093 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d74bcc6f7-mfhfl"] Apr 20 15:02:02.654322 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.654240 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.697603 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.697568 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-oauth-serving-cert\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.697781 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.697621 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-trusted-ca-bundle\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.697866 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.697772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/06626336-d19f-4aae-9c9d-1d525e4876fd-console-oauth-config\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.697866 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.697811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-service-ca\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.697962 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.697902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/06626336-d19f-4aae-9c9d-1d525e4876fd-console-serving-cert\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.697962 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.697956 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxmzm\" (UniqueName: \"kubernetes.io/projected/06626336-d19f-4aae-9c9d-1d525e4876fd-kube-api-access-sxmzm\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.698084 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.698001 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-console-config\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.799235 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.799191 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-trusted-ca-bundle\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.799438 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.799289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/06626336-d19f-4aae-9c9d-1d525e4876fd-console-oauth-config\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.799438 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.799312 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-service-ca\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.799438 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.799352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/06626336-d19f-4aae-9c9d-1d525e4876fd-console-serving-cert\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.799438 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.799383 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxmzm\" (UniqueName: \"kubernetes.io/projected/06626336-d19f-4aae-9c9d-1d525e4876fd-kube-api-access-sxmzm\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.799438 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.799419 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-console-config\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.799727 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.799442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-oauth-serving-cert\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl" Apr 20 15:02:02.800213 ip-10-0-140-93 
kubenswrapper[2575]: I0420 15:02:02.800161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-service-ca\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:02.800213 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.800205 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-console-config\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:02.800398 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.800218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-trusted-ca-bundle\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:02.800398 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.800271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/06626336-d19f-4aae-9c9d-1d525e4876fd-oauth-serving-cert\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:02.802506 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.802487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/06626336-d19f-4aae-9c9d-1d525e4876fd-console-oauth-config\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:02.802847 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.802824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/06626336-d19f-4aae-9c9d-1d525e4876fd-console-serving-cert\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:02.809499 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.809475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxmzm\" (UniqueName: \"kubernetes.io/projected/06626336-d19f-4aae-9c9d-1d525e4876fd-kube-api-access-sxmzm\") pod \"console-d74bcc6f7-mfhfl\" (UID: \"06626336-d19f-4aae-9c9d-1d525e4876fd\") " pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:02.965790 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:02.965569 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:08.597001 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:08.596967 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"
Apr 20 15:02:08.597438 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:08.597112 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"
Apr 20 15:02:08.680693 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:08.680662 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"]
Apr 20 15:02:08.680951 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:08.680924 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" podUID="e5ea4974-b456-45f3-aab4-c493274c53b0" containerName="manager" containerID="cri-o://d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf" gracePeriod=10
Apr 20 15:02:11.835035 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:11.834989 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"
Apr 20 15:02:11.867050 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:11.867006 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d74bcc6f7-mfhfl"]
Apr 20 15:02:11.867408 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:02:11.867381 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06626336_d19f_4aae_9c9d_1d525e4876fd.slice/crio-e5233b38fdc676ba7639615967839668c8ee51038851c1efb1ee9a1e2b3eb4aa WatchSource:0}: Error finding container e5233b38fdc676ba7639615967839668c8ee51038851c1efb1ee9a1e2b3eb4aa: Status 404 returned error can't find the container with id e5233b38fdc676ba7639615967839668c8ee51038851c1efb1ee9a1e2b3eb4aa
Apr 20 15:02:11.884241 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:11.884219 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5ea4974-b456-45f3-aab4-c493274c53b0-extensions-socket-volume\") pod \"e5ea4974-b456-45f3-aab4-c493274c53b0\" (UID: \"e5ea4974-b456-45f3-aab4-c493274c53b0\") "
Apr 20 15:02:11.884363 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:11.884347 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-566tj\" (UniqueName: \"kubernetes.io/projected/e5ea4974-b456-45f3-aab4-c493274c53b0-kube-api-access-566tj\") pod \"e5ea4974-b456-45f3-aab4-c493274c53b0\" (UID: \"e5ea4974-b456-45f3-aab4-c493274c53b0\") "
Apr 20 15:02:11.884677 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:11.884635 2575 operation_generator.go:781] UnmountVolume.TearDown
succeeded for volume "kubernetes.io/empty-dir/e5ea4974-b456-45f3-aab4-c493274c53b0-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e5ea4974-b456-45f3-aab4-c493274c53b0" (UID: "e5ea4974-b456-45f3-aab4-c493274c53b0"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:02:11.886517 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:11.886494 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5ea4974-b456-45f3-aab4-c493274c53b0-kube-api-access-566tj" (OuterVolumeSpecName: "kube-api-access-566tj") pod "e5ea4974-b456-45f3-aab4-c493274c53b0" (UID: "e5ea4974-b456-45f3-aab4-c493274c53b0"). InnerVolumeSpecName "kube-api-access-566tj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:02:11.985954 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:11.985865 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-566tj\" (UniqueName: \"kubernetes.io/projected/e5ea4974-b456-45f3-aab4-c493274c53b0-kube-api-access-566tj\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 15:02:11.985954 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:11.985898 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e5ea4974-b456-45f3-aab4-c493274c53b0-extensions-socket-volume\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 15:02:12.657513 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.657473 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d74bcc6f7-mfhfl" event={"ID":"06626336-d19f-4aae-9c9d-1d525e4876fd","Type":"ContainerStarted","Data":"b5acae1eb19e798e47f562f2836f23d78df2d9e526c8cde97292d966e36f7bbe"}
Apr 20 15:02:12.657513 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.657513 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d74bcc6f7-mfhfl" event={"ID":"06626336-d19f-4aae-9c9d-1d525e4876fd","Type":"ContainerStarted","Data":"e5233b38fdc676ba7639615967839668c8ee51038851c1efb1ee9a1e2b3eb4aa"}
Apr 20 15:02:12.662647 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.662611 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7" event={"ID":"97f7f61e-7044-4647-b569-38e5b28ec72c","Type":"ContainerStarted","Data":"1269629b2a860deae3254f184975166588942b929c84010a73d395b33330161f"}
Apr 20 15:02:12.663724 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.663702 2575 generic.go:358] "Generic (PLEG): container finished" podID="e5ea4974-b456-45f3-aab4-c493274c53b0" containerID="d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf" exitCode=0
Apr 20 15:02:12.663828 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.663750 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"
Apr 20 15:02:12.663828 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.663786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" event={"ID":"e5ea4974-b456-45f3-aab4-c493274c53b0","Type":"ContainerDied","Data":"d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf"}
Apr 20 15:02:12.663828 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.663815 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8" event={"ID":"e5ea4974-b456-45f3-aab4-c493274c53b0","Type":"ContainerDied","Data":"48606e8aec1680148217333b2f5ac62757d6162e321b16565fbeaf443fe99330"}
Apr 20 15:02:12.663928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.663834 2575 scope.go:117] "RemoveContainer" containerID="d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf"
Apr 20 15:02:12.672630 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.672615 2575 scope.go:117] "RemoveContainer" containerID="d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf"
Apr 20 15:02:12.672859 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:02:12.672844 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf\": container with ID starting with d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf not found: ID does not exist" containerID="d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf"
Apr 20 15:02:12.672895 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.672868 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf"} err="failed to get container status \"d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf\": rpc error: code = NotFound desc = could not find container \"d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf\": container with ID starting with d4b56dc2a0a3c2d51b3a67032428fd954beb6f70cef170c12da8eb73ec49edbf not found: ID does not exist"
Apr 20 15:02:12.675784 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.675748 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-d74bcc6f7-mfhfl" podStartSLOduration=10.675737198 podStartE2EDuration="10.675737198s" podCreationTimestamp="2026-04-20 15:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:12.674110444 +0000 UTC m=+504.366653948" watchObservedRunningTime="2026-04-20 15:02:12.675737198 +0000 UTC m=+504.368280697"
Apr 20 15:02:12.690415 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.690392 2575 kubelet.go:2553] "SyncLoop DELETE"
source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"]
Apr 20 15:02:12.694252 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.694232 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-ws9x8"]
Apr 20 15:02:12.710471 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.710436 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-qmdj7" podStartSLOduration=1.5336701590000001 podStartE2EDuration="26.710424435s" podCreationTimestamp="2026-04-20 15:01:46 +0000 UTC" firstStartedPulling="2026-04-20 15:01:46.582367314 +0000 UTC m=+478.274910809" lastFinishedPulling="2026-04-20 15:02:11.759121593 +0000 UTC m=+503.451665085" observedRunningTime="2026-04-20 15:02:12.708307081 +0000 UTC m=+504.400850581" watchObservedRunningTime="2026-04-20 15:02:12.710424435 +0000 UTC m=+504.402967934"
Apr 20 15:02:12.868895 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.868863 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5ea4974-b456-45f3-aab4-c493274c53b0" path="/var/lib/kubelet/pods/e5ea4974-b456-45f3-aab4-c493274c53b0/volumes"
Apr 20 15:02:12.966720 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.966634 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:12.966720 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.966674 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:12.971364 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:12.971339 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:13.672405 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:13.672378 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d74bcc6f7-mfhfl"
Apr 20 15:02:13.722621 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:13.722587 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fb5cd5f68-msfxx"]
Apr 20 15:02:25.013438 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.013396 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"]
Apr 20 15:02:25.013999 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.013839 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5ea4974-b456-45f3-aab4-c493274c53b0" containerName="manager"
Apr 20 15:02:25.013999 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.013855 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5ea4974-b456-45f3-aab4-c493274c53b0" containerName="manager"
Apr 20 15:02:25.013999 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.013937 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5ea4974-b456-45f3-aab4-c493274c53b0" containerName="manager"
Apr 20 15:02:25.017548 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.017516 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.020009 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.019978 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-n7qvt\""
Apr 20 15:02:25.029672 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.029648 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"]
Apr 20 15:02:25.101239 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101205 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.101394 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8101843f-59d4-44da-8496-cc21d3abb48b-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.101394 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101274 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") "
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.101394 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.101512 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8101843f-59d4-44da-8496-cc21d3abb48b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.101512 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.101512 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.101642 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101573 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbp25\" (UniqueName: \"kubernetes.io/projected/8101843f-59d4-44da-8496-cc21d3abb48b-kube-api-access-fbp25\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.101642 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.101620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8101843f-59d4-44da-8496-cc21d3abb48b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203075 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203266 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203266 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8101843f-59d4-44da-8496-cc21d3abb48b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203266 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbp25\" (UniqueName: \"kubernetes.io/projected/8101843f-59d4-44da-8496-cc21d3abb48b-kube-api-access-fbp25\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203454
ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203438 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8101843f-59d4-44da-8496-cc21d3abb48b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203664 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203664 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8101843f-59d4-44da-8496-cc21d3abb48b-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203664 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203554 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203857 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203912 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203889 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8101843f-59d4-44da-8496-cc21d3abb48b-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.203967 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.203952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.205512 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.205480 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8101843f-59d4-44da-8496-cc21d3abb48b-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.205872 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.205825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8101843f-59d4-44da-8496-cc21d3abb48b-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.211339 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.211315 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8101843f-59d4-44da-8496-cc21d3abb48b-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.211566 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.211549 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbp25\" (UniqueName: \"kubernetes.io/projected/8101843f-59d4-44da-8496-cc21d3abb48b-kube-api-access-fbp25\") pod \"maas-default-gateway-openshift-default-845c6b4b48-zwxkw\" (UID: \"8101843f-59d4-44da-8496-cc21d3abb48b\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.329887 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.329857 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:25.458607 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.458580 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"]
Apr 20 15:02:25.460060 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:02:25.460034 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8101843f_59d4_44da_8496_cc21d3abb48b.slice/crio-ab6f709daca3eea4cd723fc63ce72560780135f918cf7000e24c78fea556ca6b WatchSource:0}: Error finding container ab6f709daca3eea4cd723fc63ce72560780135f918cf7000e24c78fea556ca6b: Status 404 returned error can't find the container with id ab6f709daca3eea4cd723fc63ce72560780135f918cf7000e24c78fea556ca6b
Apr 20 15:02:25.462201 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.462160 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 15:02:25.462285 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.462238 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 15:02:25.462285 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.462268 2575 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"}
Apr 20 15:02:25.719906 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.719819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw" event={"ID":"8101843f-59d4-44da-8496-cc21d3abb48b","Type":"ContainerStarted","Data":"994cf1942d6af82125387411857433d88cc9f5dba183173181a5536eaa0977c8"}
Apr 20 15:02:25.719906 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.719858 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw" event={"ID":"8101843f-59d4-44da-8496-cc21d3abb48b","Type":"ContainerStarted","Data":"ab6f709daca3eea4cd723fc63ce72560780135f918cf7000e24c78fea556ca6b"}
Apr 20 15:02:25.737839 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:25.737795 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw" podStartSLOduration=1.737782025 podStartE2EDuration="1.737782025s" podCreationTimestamp="2026-04-20 15:02:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:02:25.736577612 +0000 UTC m=+517.429121123" watchObservedRunningTime="2026-04-20 15:02:25.737782025 +0000 UTC m=+517.430325523"
Apr 20 15:02:26.330869 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:26.330834 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:26.335744 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:26.335720 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:26.723951 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:26.723873 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:26.724858 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:26.724837 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-zwxkw"
Apr 20 15:02:29.202562 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.202524 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:02:29.216368 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.216341 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:02:29.216507 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.216480 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-clp8b"
Apr 20 15:02:29.218742 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.218719 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 15:02:29.225269 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.225233 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:02:29.338636 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.338600 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrg7q\" (UniqueName: \"kubernetes.io/projected/f4325a55-6660-47f1-b8a8-be49d04f9f2e-kube-api-access-xrg7q\") pod \"limitador-limitador-78c99df468-clp8b\" (UID: \"f4325a55-6660-47f1-b8a8-be49d04f9f2e\") " pod="kuadrant-system/limitador-limitador-78c99df468-clp8b"
Apr 20 15:02:29.338820 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.338668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f4325a55-6660-47f1-b8a8-be49d04f9f2e-config-file\") pod \"limitador-limitador-78c99df468-clp8b\" (UID: \"f4325a55-6660-47f1-b8a8-be49d04f9f2e\") "
pod="kuadrant-system/limitador-limitador-78c99df468-clp8b"
Apr 20 15:02:29.439909 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.439874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrg7q\" (UniqueName: \"kubernetes.io/projected/f4325a55-6660-47f1-b8a8-be49d04f9f2e-kube-api-access-xrg7q\") pod \"limitador-limitador-78c99df468-clp8b\" (UID: \"f4325a55-6660-47f1-b8a8-be49d04f9f2e\") " pod="kuadrant-system/limitador-limitador-78c99df468-clp8b"
Apr 20 15:02:29.440107 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.439928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f4325a55-6660-47f1-b8a8-be49d04f9f2e-config-file\") pod \"limitador-limitador-78c99df468-clp8b\" (UID: \"f4325a55-6660-47f1-b8a8-be49d04f9f2e\") " pod="kuadrant-system/limitador-limitador-78c99df468-clp8b"
Apr 20 15:02:29.440495 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.440476 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/f4325a55-6660-47f1-b8a8-be49d04f9f2e-config-file\") pod \"limitador-limitador-78c99df468-clp8b\" (UID: \"f4325a55-6660-47f1-b8a8-be49d04f9f2e\") " pod="kuadrant-system/limitador-limitador-78c99df468-clp8b"
Apr 20 15:02:29.448119 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.448096 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrg7q\" (UniqueName: \"kubernetes.io/projected/f4325a55-6660-47f1-b8a8-be49d04f9f2e-kube-api-access-xrg7q\") pod \"limitador-limitador-78c99df468-clp8b\" (UID: \"f4325a55-6660-47f1-b8a8-be49d04f9f2e\") " pod="kuadrant-system/limitador-limitador-78c99df468-clp8b"
Apr 20 15:02:29.529481 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.529447 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-clp8b"
Apr 20 15:02:29.860819 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:02:29.860776 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4325a55_6660_47f1_b8a8_be49d04f9f2e.slice/crio-dc18d7adc02fc178c46cac3fde14e97a44229b50b4dae33c779095612d85cd5a WatchSource:0}: Error finding container dc18d7adc02fc178c46cac3fde14e97a44229b50b4dae33c779095612d85cd5a: Status 404 returned error can't find the container with id dc18d7adc02fc178c46cac3fde14e97a44229b50b4dae33c779095612d85cd5a
Apr 20 15:02:29.861629 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.861600 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:02:29.896147 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.896122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-hsgwn"]
Apr 20 15:02:29.899178 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.899161 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hsgwn"
Apr 20 15:02:29.901919 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.901872 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-jv4pj\""
Apr 20 15:02:29.904931 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:29.904910 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-hsgwn"]
Apr 20 15:02:30.046185 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:30.046150 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzzhl\" (UniqueName: \"kubernetes.io/projected/029e76e5-42b7-45f9-ba79-f5049719c775-kube-api-access-dzzhl\") pod \"authorino-7498df8756-hsgwn\" (UID: \"029e76e5-42b7-45f9-ba79-f5049719c775\") " pod="kuadrant-system/authorino-7498df8756-hsgwn"
Apr 20 15:02:30.147530 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:30.147446 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzzhl\" (UniqueName: \"kubernetes.io/projected/029e76e5-42b7-45f9-ba79-f5049719c775-kube-api-access-dzzhl\") pod \"authorino-7498df8756-hsgwn\" (UID: \"029e76e5-42b7-45f9-ba79-f5049719c775\") " pod="kuadrant-system/authorino-7498df8756-hsgwn"
Apr 20 15:02:30.154908 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:30.154885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzzhl\" (UniqueName: \"kubernetes.io/projected/029e76e5-42b7-45f9-ba79-f5049719c775-kube-api-access-dzzhl\") pod \"authorino-7498df8756-hsgwn\" (UID: \"029e76e5-42b7-45f9-ba79-f5049719c775\") " pod="kuadrant-system/authorino-7498df8756-hsgwn"
Apr 20 15:02:30.209589 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:30.209552 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hsgwn" Apr 20 15:02:30.337279 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:30.337251 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-hsgwn"] Apr 20 15:02:30.338920 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:02:30.338893 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod029e76e5_42b7_45f9_ba79_f5049719c775.slice/crio-eaff34dd3996b1da07253b73f286916c9f91c564a3a9c8ee76c5a2e7ba59e69a WatchSource:0}: Error finding container eaff34dd3996b1da07253b73f286916c9f91c564a3a9c8ee76c5a2e7ba59e69a: Status 404 returned error can't find the container with id eaff34dd3996b1da07253b73f286916c9f91c564a3a9c8ee76c5a2e7ba59e69a Apr 20 15:02:30.740711 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:30.740659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hsgwn" event={"ID":"029e76e5-42b7-45f9-ba79-f5049719c775","Type":"ContainerStarted","Data":"eaff34dd3996b1da07253b73f286916c9f91c564a3a9c8ee76c5a2e7ba59e69a"} Apr 20 15:02:30.741997 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:30.741969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-clp8b" event={"ID":"f4325a55-6660-47f1-b8a8-be49d04f9f2e","Type":"ContainerStarted","Data":"dc18d7adc02fc178c46cac3fde14e97a44229b50b4dae33c779095612d85cd5a"} Apr 20 15:02:34.768749 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:34.768711 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hsgwn" event={"ID":"029e76e5-42b7-45f9-ba79-f5049719c775","Type":"ContainerStarted","Data":"a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2"} Apr 20 15:02:34.770208 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:34.770179 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-limitador-78c99df468-clp8b" event={"ID":"f4325a55-6660-47f1-b8a8-be49d04f9f2e","Type":"ContainerStarted","Data":"a5d1002cdc97cab77998be834afbe4e6ec462ac4e197ed5befef4b1442e147bb"} Apr 20 15:02:34.770342 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:34.770230 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-clp8b" Apr 20 15:02:34.783859 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:34.783814 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-hsgwn" podStartSLOduration=2.421278243 podStartE2EDuration="5.783801968s" podCreationTimestamp="2026-04-20 15:02:29 +0000 UTC" firstStartedPulling="2026-04-20 15:02:30.34090211 +0000 UTC m=+522.033445602" lastFinishedPulling="2026-04-20 15:02:33.703425843 +0000 UTC m=+525.395969327" observedRunningTime="2026-04-20 15:02:34.782161164 +0000 UTC m=+526.474704662" watchObservedRunningTime="2026-04-20 15:02:34.783801968 +0000 UTC m=+526.476345488" Apr 20 15:02:34.800566 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:34.800522 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-clp8b" podStartSLOduration=1.718738512 podStartE2EDuration="5.800509751s" podCreationTimestamp="2026-04-20 15:02:29 +0000 UTC" firstStartedPulling="2026-04-20 15:02:29.862766952 +0000 UTC m=+521.555310432" lastFinishedPulling="2026-04-20 15:02:33.944538193 +0000 UTC m=+525.637081671" observedRunningTime="2026-04-20 15:02:34.797318883 +0000 UTC m=+526.489862394" watchObservedRunningTime="2026-04-20 15:02:34.800509751 +0000 UTC m=+526.493053269" Apr 20 15:02:38.743318 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:38.743279 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6fb5cd5f68-msfxx" podUID="b13f18be-43ba-486b-9f7d-c6bb7f4b2688" 
containerName="console" containerID="cri-o://b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9" gracePeriod=15 Apr 20 15:02:39.543450 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.543424 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fb5cd5f68-msfxx_b13f18be-43ba-486b-9f7d-c6bb7f4b2688/console/0.log" Apr 20 15:02:39.543571 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.543484 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 15:02:39.640286 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640253 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxtll\" (UniqueName: \"kubernetes.io/projected/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-kube-api-access-zxtll\") pod \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " Apr 20 15:02:39.640542 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640308 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-serving-cert\") pod \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " Apr 20 15:02:39.640542 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640357 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-oauth-config\") pod \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " Apr 20 15:02:39.640542 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640460 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-oauth-serving-cert\") pod \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " Apr 20 15:02:39.640542 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640528 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-trusted-ca-bundle\") pod \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " Apr 20 15:02:39.640853 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640554 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-config\") pod \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " Apr 20 15:02:39.640853 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640581 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-service-ca\") pod \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\" (UID: \"b13f18be-43ba-486b-9f7d-c6bb7f4b2688\") " Apr 20 15:02:39.641005 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640844 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b13f18be-43ba-486b-9f7d-c6bb7f4b2688" (UID: "b13f18be-43ba-486b-9f7d-c6bb7f4b2688"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:02:39.641005 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.640950 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-config" (OuterVolumeSpecName: "console-config") pod "b13f18be-43ba-486b-9f7d-c6bb7f4b2688" (UID: "b13f18be-43ba-486b-9f7d-c6bb7f4b2688"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:02:39.641271 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.641243 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-service-ca" (OuterVolumeSpecName: "service-ca") pod "b13f18be-43ba-486b-9f7d-c6bb7f4b2688" (UID: "b13f18be-43ba-486b-9f7d-c6bb7f4b2688"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:02:39.641630 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.641604 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b13f18be-43ba-486b-9f7d-c6bb7f4b2688" (UID: "b13f18be-43ba-486b-9f7d-c6bb7f4b2688"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:02:39.642871 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.642842 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b13f18be-43ba-486b-9f7d-c6bb7f4b2688" (UID: "b13f18be-43ba-486b-9f7d-c6bb7f4b2688"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:02:39.643279 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.643250 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b13f18be-43ba-486b-9f7d-c6bb7f4b2688" (UID: "b13f18be-43ba-486b-9f7d-c6bb7f4b2688"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:02:39.643385 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.643285 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-kube-api-access-zxtll" (OuterVolumeSpecName: "kube-api-access-zxtll") pod "b13f18be-43ba-486b-9f7d-c6bb7f4b2688" (UID: "b13f18be-43ba-486b-9f7d-c6bb7f4b2688"). InnerVolumeSpecName "kube-api-access-zxtll". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:02:39.741928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.741857 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-oauth-serving-cert\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:02:39.741928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.741883 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-trusted-ca-bundle\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:02:39.741928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.741893 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-config\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:02:39.741928 ip-10-0-140-93 
kubenswrapper[2575]: I0420 15:02:39.741902 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-service-ca\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:02:39.741928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.741910 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zxtll\" (UniqueName: \"kubernetes.io/projected/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-kube-api-access-zxtll\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:02:39.741928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.741919 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-serving-cert\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:02:39.741928 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.741928 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b13f18be-43ba-486b-9f7d-c6bb7f4b2688-console-oauth-config\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:02:39.792012 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.791979 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6fb5cd5f68-msfxx_b13f18be-43ba-486b-9f7d-c6bb7f4b2688/console/0.log" Apr 20 15:02:39.792454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.792057 2575 generic.go:358] "Generic (PLEG): container finished" podID="b13f18be-43ba-486b-9f7d-c6bb7f4b2688" containerID="b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9" exitCode=2 Apr 20 15:02:39.792454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.792145 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb5cd5f68-msfxx" 
event={"ID":"b13f18be-43ba-486b-9f7d-c6bb7f4b2688","Type":"ContainerDied","Data":"b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9"} Apr 20 15:02:39.792454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.792187 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fb5cd5f68-msfxx" event={"ID":"b13f18be-43ba-486b-9f7d-c6bb7f4b2688","Type":"ContainerDied","Data":"4e02144d43272276c523c2e49e02dc8f9488b10171aa47615cb6239e743691ba"} Apr 20 15:02:39.792454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.792202 2575 scope.go:117] "RemoveContainer" containerID="b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9" Apr 20 15:02:39.792454 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.792156 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fb5cd5f68-msfxx" Apr 20 15:02:39.805283 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.803443 2575 scope.go:117] "RemoveContainer" containerID="b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9" Apr 20 15:02:39.805693 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:02:39.805672 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9\": container with ID starting with b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9 not found: ID does not exist" containerID="b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9" Apr 20 15:02:39.805795 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.805700 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9"} err="failed to get container status \"b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9\": rpc error: code = NotFound desc = could not find container 
\"b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9\": container with ID starting with b7d1f85a25172e7c351829f7d40dd40b2a934f30a4b3c79a64299944f502a0b9 not found: ID does not exist" Apr 20 15:02:39.818032 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.817924 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6fb5cd5f68-msfxx"] Apr 20 15:02:39.823527 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:39.823501 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6fb5cd5f68-msfxx"] Apr 20 15:02:40.869260 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:40.869225 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b13f18be-43ba-486b-9f7d-c6bb7f4b2688" path="/var/lib/kubelet/pods/b13f18be-43ba-486b-9f7d-c6bb7f4b2688/volumes" Apr 20 15:02:45.775462 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:02:45.775433 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-clp8b" Apr 20 15:03:03.491160 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.491118 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f5bfcdffd-9qjk2"] Apr 20 15:03:03.492566 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.492510 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b13f18be-43ba-486b-9f7d-c6bb7f4b2688" containerName="console" Apr 20 15:03:03.492826 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.492808 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="b13f18be-43ba-486b-9f7d-c6bb7f4b2688" containerName="console" Apr 20 15:03:03.493379 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.493354 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="b13f18be-43ba-486b-9f7d-c6bb7f4b2688" containerName="console" Apr 20 15:03:03.495828 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.495795 2575 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" Apr 20 15:03:03.497436 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.497414 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f5bfcdffd-9qjk2"] Apr 20 15:03:03.500467 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.500446 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 15:03:03.540174 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.540148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q862d\" (UniqueName: \"kubernetes.io/projected/e1de48a6-1f4b-423c-876a-b2151c966d67-kube-api-access-q862d\") pod \"authorino-f5bfcdffd-9qjk2\" (UID: \"e1de48a6-1f4b-423c-876a-b2151c966d67\") " pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" Apr 20 15:03:03.540274 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.540200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e1de48a6-1f4b-423c-876a-b2151c966d67-tls-cert\") pod \"authorino-f5bfcdffd-9qjk2\" (UID: \"e1de48a6-1f4b-423c-876a-b2151c966d67\") " pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" Apr 20 15:03:03.641583 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.641554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q862d\" (UniqueName: \"kubernetes.io/projected/e1de48a6-1f4b-423c-876a-b2151c966d67-kube-api-access-q862d\") pod \"authorino-f5bfcdffd-9qjk2\" (UID: \"e1de48a6-1f4b-423c-876a-b2151c966d67\") " pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" Apr 20 15:03:03.641704 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.641599 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/e1de48a6-1f4b-423c-876a-b2151c966d67-tls-cert\") pod \"authorino-f5bfcdffd-9qjk2\" (UID: \"e1de48a6-1f4b-423c-876a-b2151c966d67\") " pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" Apr 20 15:03:03.644111 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.644087 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e1de48a6-1f4b-423c-876a-b2151c966d67-tls-cert\") pod \"authorino-f5bfcdffd-9qjk2\" (UID: \"e1de48a6-1f4b-423c-876a-b2151c966d67\") " pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" Apr 20 15:03:03.649336 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.649313 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q862d\" (UniqueName: \"kubernetes.io/projected/e1de48a6-1f4b-423c-876a-b2151c966d67-kube-api-access-q862d\") pod \"authorino-f5bfcdffd-9qjk2\" (UID: \"e1de48a6-1f4b-423c-876a-b2151c966d67\") " pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" Apr 20 15:03:03.806094 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.806066 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" Apr 20 15:03:03.932315 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:03.932291 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f5bfcdffd-9qjk2"] Apr 20 15:03:03.934254 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:03:03.934219 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1de48a6_1f4b_423c_876a_b2151c966d67.slice/crio-60a4fb6ecc8bf648830bcca316848ca25daee49656a4ac63f9d9f370ba476573 WatchSource:0}: Error finding container 60a4fb6ecc8bf648830bcca316848ca25daee49656a4ac63f9d9f370ba476573: Status 404 returned error can't find the container with id 60a4fb6ecc8bf648830bcca316848ca25daee49656a4ac63f9d9f370ba476573 Apr 20 15:03:04.899814 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:04.899772 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" event={"ID":"e1de48a6-1f4b-423c-876a-b2151c966d67","Type":"ContainerStarted","Data":"2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1"} Apr 20 15:03:04.900216 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:04.899819 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" event={"ID":"e1de48a6-1f4b-423c-876a-b2151c966d67","Type":"ContainerStarted","Data":"60a4fb6ecc8bf648830bcca316848ca25daee49656a4ac63f9d9f370ba476573"} Apr 20 15:03:04.914881 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:04.914834 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" podStartSLOduration=1.394096605 podStartE2EDuration="1.914818695s" podCreationTimestamp="2026-04-20 15:03:03 +0000 UTC" firstStartedPulling="2026-04-20 15:03:03.935486073 +0000 UTC m=+555.628029553" lastFinishedPulling="2026-04-20 15:03:04.456208167 +0000 UTC m=+556.148751643" 
observedRunningTime="2026-04-20 15:03:04.913194157 +0000 UTC m=+556.605737656" watchObservedRunningTime="2026-04-20 15:03:04.914818695 +0000 UTC m=+556.607362193" Apr 20 15:03:04.939362 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:04.939329 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-hsgwn"] Apr 20 15:03:04.939554 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:04.939533 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-hsgwn" podUID="029e76e5-42b7-45f9-ba79-f5049719c775" containerName="authorino" containerID="cri-o://a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2" gracePeriod=30 Apr 20 15:03:05.218568 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.218545 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hsgwn" Apr 20 15:03:05.256975 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.256950 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzzhl\" (UniqueName: \"kubernetes.io/projected/029e76e5-42b7-45f9-ba79-f5049719c775-kube-api-access-dzzhl\") pod \"029e76e5-42b7-45f9-ba79-f5049719c775\" (UID: \"029e76e5-42b7-45f9-ba79-f5049719c775\") " Apr 20 15:03:05.259487 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.259463 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/029e76e5-42b7-45f9-ba79-f5049719c775-kube-api-access-dzzhl" (OuterVolumeSpecName: "kube-api-access-dzzhl") pod "029e76e5-42b7-45f9-ba79-f5049719c775" (UID: "029e76e5-42b7-45f9-ba79-f5049719c775"). InnerVolumeSpecName "kube-api-access-dzzhl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:05.357699 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.357670 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzzhl\" (UniqueName: \"kubernetes.io/projected/029e76e5-42b7-45f9-ba79-f5049719c775-kube-api-access-dzzhl\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:03:05.904430 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.904392 2575 generic.go:358] "Generic (PLEG): container finished" podID="029e76e5-42b7-45f9-ba79-f5049719c775" containerID="a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2" exitCode=0 Apr 20 15:03:05.904919 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.904444 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-hsgwn" Apr 20 15:03:05.904919 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.904478 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hsgwn" event={"ID":"029e76e5-42b7-45f9-ba79-f5049719c775","Type":"ContainerDied","Data":"a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2"} Apr 20 15:03:05.904919 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.904519 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-hsgwn" event={"ID":"029e76e5-42b7-45f9-ba79-f5049719c775","Type":"ContainerDied","Data":"eaff34dd3996b1da07253b73f286916c9f91c564a3a9c8ee76c5a2e7ba59e69a"} Apr 20 15:03:05.904919 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.904540 2575 scope.go:117] "RemoveContainer" containerID="a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2" Apr 20 15:03:05.913567 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.913551 2575 scope.go:117] "RemoveContainer" containerID="a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2" Apr 20 15:03:05.913833 ip-10-0-140-93 kubenswrapper[2575]: E0420 
15:03:05.913810 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2\": container with ID starting with a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2 not found: ID does not exist" containerID="a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2" Apr 20 15:03:05.913891 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.913844 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2"} err="failed to get container status \"a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2\": rpc error: code = NotFound desc = could not find container \"a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2\": container with ID starting with a07e50c12dbc5ff5b1bff521fc21c12a81bc5197891d26106c5aae8b319205a2 not found: ID does not exist" Apr 20 15:03:05.927444 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.927416 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-hsgwn"] Apr 20 15:03:05.930973 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:05.930952 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-hsgwn"] Apr 20 15:03:06.870034 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:06.869987 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="029e76e5-42b7-45f9-ba79-f5049719c775" path="/var/lib/kubelet/pods/029e76e5-42b7-45f9-ba79-f5049719c775/volumes" Apr 20 15:03:29.816209 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:29.816171 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:03:48.791474 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:48.791442 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 15:03:48.792112 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:48.791768 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 15:03:56.447811 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:03:56.447767 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:04:03.241224 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:04:03.241147 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:04:12.830183 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:04:12.830145 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:04:26.634836 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:04:26.634791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:04:34.533206 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:04:34.533169 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:04:48.228237 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:04:48.228194 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:05:18.231116 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.231077 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-d49cd9b47-428ld"]
Apr 20 15:05:18.231632 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.231468 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="029e76e5-42b7-45f9-ba79-f5049719c775" containerName="authorino"
Apr 20 15:05:18.231632 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.231479 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="029e76e5-42b7-45f9-ba79-f5049719c775" containerName="authorino"
Apr 20 15:05:18.231632 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.231546 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="029e76e5-42b7-45f9-ba79-f5049719c775" containerName="authorino"
Apr 20 15:05:18.233492 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.233476 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-d49cd9b47-428ld"
Apr 20 15:05:18.241547 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.241524 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-d49cd9b47-428ld"]
Apr 20 15:05:18.358910 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.358879 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgkzt\" (UniqueName: \"kubernetes.io/projected/b3053491-a607-482c-8f52-67f92d2191ef-kube-api-access-bgkzt\") pod \"authorino-d49cd9b47-428ld\" (UID: \"b3053491-a607-482c-8f52-67f92d2191ef\") " pod="kuadrant-system/authorino-d49cd9b47-428ld"
Apr 20 15:05:18.359110 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.358930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3053491-a607-482c-8f52-67f92d2191ef-tls-cert\") pod \"authorino-d49cd9b47-428ld\" (UID: \"b3053491-a607-482c-8f52-67f92d2191ef\") " pod="kuadrant-system/authorino-d49cd9b47-428ld"
Apr 20 15:05:18.460329 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.460291 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgkzt\" (UniqueName: \"kubernetes.io/projected/b3053491-a607-482c-8f52-67f92d2191ef-kube-api-access-bgkzt\") pod \"authorino-d49cd9b47-428ld\" (UID: \"b3053491-a607-482c-8f52-67f92d2191ef\") " pod="kuadrant-system/authorino-d49cd9b47-428ld"
Apr 20 15:05:18.460528 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.460346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3053491-a607-482c-8f52-67f92d2191ef-tls-cert\") pod \"authorino-d49cd9b47-428ld\" (UID: \"b3053491-a607-482c-8f52-67f92d2191ef\") " pod="kuadrant-system/authorino-d49cd9b47-428ld"
Apr 20 15:05:18.462899 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.462872 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3053491-a607-482c-8f52-67f92d2191ef-tls-cert\") pod \"authorino-d49cd9b47-428ld\" (UID: \"b3053491-a607-482c-8f52-67f92d2191ef\") " pod="kuadrant-system/authorino-d49cd9b47-428ld"
Apr 20 15:05:18.467850 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.467830 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgkzt\" (UniqueName: \"kubernetes.io/projected/b3053491-a607-482c-8f52-67f92d2191ef-kube-api-access-bgkzt\") pod \"authorino-d49cd9b47-428ld\" (UID: \"b3053491-a607-482c-8f52-67f92d2191ef\") " pod="kuadrant-system/authorino-d49cd9b47-428ld"
Apr 20 15:05:18.544383 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.544354 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-d49cd9b47-428ld"
Apr 20 15:05:18.671666 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.671627 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-d49cd9b47-428ld"]
Apr 20 15:05:18.673148 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:05:18.673123 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3053491_a607_482c_8f52_67f92d2191ef.slice/crio-3f3c89f73b6a3bff7c37a923b8f974d1617e0689a746bfa2b5670dc4d2866dcd WatchSource:0}: Error finding container 3f3c89f73b6a3bff7c37a923b8f974d1617e0689a746bfa2b5670dc4d2866dcd: Status 404 returned error can't find the container with id 3f3c89f73b6a3bff7c37a923b8f974d1617e0689a746bfa2b5670dc4d2866dcd
Apr 20 15:05:18.674552 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:18.674525 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:05:19.436148 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.436060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-d49cd9b47-428ld" event={"ID":"b3053491-a607-482c-8f52-67f92d2191ef","Type":"ContainerStarted","Data":"d586eeab48ca3ffe0293b153f43b32bbcd64b122aa2239ec525e8f2cf84c08d1"}
Apr 20 15:05:19.436148 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.436108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-d49cd9b47-428ld" event={"ID":"b3053491-a607-482c-8f52-67f92d2191ef","Type":"ContainerStarted","Data":"3f3c89f73b6a3bff7c37a923b8f974d1617e0689a746bfa2b5670dc4d2866dcd"}
Apr 20 15:05:19.452668 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.452602 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-d49cd9b47-428ld" podStartSLOduration=1.032539305 podStartE2EDuration="1.452583391s" podCreationTimestamp="2026-04-20 15:05:18 +0000 UTC" firstStartedPulling="2026-04-20 15:05:18.674696807 +0000 UTC m=+690.367240286" lastFinishedPulling="2026-04-20 15:05:19.094740896 +0000 UTC m=+690.787284372" observedRunningTime="2026-04-20 15:05:19.449274523 +0000 UTC m=+691.141818020" watchObservedRunningTime="2026-04-20 15:05:19.452583391 +0000 UTC m=+691.145126892"
Apr 20 15:05:19.482104 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.480007 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f5bfcdffd-9qjk2"]
Apr 20 15:05:19.482104 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.480667 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" podUID="e1de48a6-1f4b-423c-876a-b2151c966d67" containerName="authorino" containerID="cri-o://2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1" gracePeriod=30
Apr 20 15:05:19.726137 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.726114 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2"
Apr 20 15:05:19.774083 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.774052 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e1de48a6-1f4b-423c-876a-b2151c966d67-tls-cert\") pod \"e1de48a6-1f4b-423c-876a-b2151c966d67\" (UID: \"e1de48a6-1f4b-423c-876a-b2151c966d67\") "
Apr 20 15:05:19.774281 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.774169 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q862d\" (UniqueName: \"kubernetes.io/projected/e1de48a6-1f4b-423c-876a-b2151c966d67-kube-api-access-q862d\") pod \"e1de48a6-1f4b-423c-876a-b2151c966d67\" (UID: \"e1de48a6-1f4b-423c-876a-b2151c966d67\") "
Apr 20 15:05:19.776455 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.776418 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1de48a6-1f4b-423c-876a-b2151c966d67-kube-api-access-q862d" (OuterVolumeSpecName: "kube-api-access-q862d") pod "e1de48a6-1f4b-423c-876a-b2151c966d67" (UID: "e1de48a6-1f4b-423c-876a-b2151c966d67"). InnerVolumeSpecName "kube-api-access-q862d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:05:19.784901 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.784870 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1de48a6-1f4b-423c-876a-b2151c966d67-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "e1de48a6-1f4b-423c-876a-b2151c966d67" (UID: "e1de48a6-1f4b-423c-876a-b2151c966d67"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 15:05:19.875693 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.875654 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q862d\" (UniqueName: \"kubernetes.io/projected/e1de48a6-1f4b-423c-876a-b2151c966d67-kube-api-access-q862d\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 15:05:19.875693 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:19.875688 2575 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e1de48a6-1f4b-423c-876a-b2151c966d67-tls-cert\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 15:05:20.442008 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.441977 2575 generic.go:358] "Generic (PLEG): container finished" podID="e1de48a6-1f4b-423c-876a-b2151c966d67" containerID="2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1" exitCode=0
Apr 20 15:05:20.442499 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.442044 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2"
Apr 20 15:05:20.442499 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.442055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" event={"ID":"e1de48a6-1f4b-423c-876a-b2151c966d67","Type":"ContainerDied","Data":"2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1"}
Apr 20 15:05:20.442499 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.442093 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f5bfcdffd-9qjk2" event={"ID":"e1de48a6-1f4b-423c-876a-b2151c966d67","Type":"ContainerDied","Data":"60a4fb6ecc8bf648830bcca316848ca25daee49656a4ac63f9d9f370ba476573"}
Apr 20 15:05:20.442499 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.442109 2575 scope.go:117] "RemoveContainer" containerID="2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1"
Apr 20 15:05:20.451362 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.451342 2575 scope.go:117] "RemoveContainer" containerID="2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1"
Apr 20 15:05:20.451616 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:05:20.451597 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1\": container with ID starting with 2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1 not found: ID does not exist" containerID="2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1"
Apr 20 15:05:20.451703 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.451622 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1"} err="failed to get container status \"2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1\": rpc error: code = NotFound desc = could not find container \"2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1\": container with ID starting with 2262e9b666aecfe3f4962fb1201f2b594ab2c25892a8976c1a3ab6181ed478b1 not found: ID does not exist"
Apr 20 15:05:20.467411 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.467386 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f5bfcdffd-9qjk2"]
Apr 20 15:05:20.470913 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.470887 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f5bfcdffd-9qjk2"]
Apr 20 15:05:20.868935 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:20.868898 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1de48a6-1f4b-423c-876a-b2151c966d67" path="/var/lib/kubelet/pods/e1de48a6-1f4b-423c-876a-b2151c966d67/volumes"
Apr 20 15:05:39.129430 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:39.129387 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:05:43.535793 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:43.535758 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:05:51.023052 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:05:51.022999 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:06:01.329214 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:06:01.329180 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:06:09.234284 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:06:09.234244 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:06:19.830383 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:06:19.830346 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:06:29.036205 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:06:29.036167 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:06:39.942372 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:06:39.942335 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:07:40.966267 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:07:40.966232 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:07:56.533463 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:07:56.533421 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:08:35.632297 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:08:35.632263 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:08:48.830371 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:08:48.830342 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 15:08:48.831716 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:08:48.831694 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 15:08:50.847289 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:08:50.847256 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:08:55.235594 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:08:55.235556 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:09:06.731040 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:09:06.730990 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:09:22.332180 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:09:22.332148 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:10:16.236739 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:10:16.236649 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:10:25.836079 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:10:25.836042 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:10:42.535629 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:10:42.535594 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:10:51.031169 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:10:51.031137 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:11:08.047625 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:11:08.047583 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:11:16.038195 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:11:16.038161 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:11:49.328817 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:11:49.328777 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:11:56.931349 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:11:56.931311 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:12:05.931481 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:12:05.931443 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:12:14.224592 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:12:14.224556 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:12:22.433362 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:12:22.433323 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:12:40.133954 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:12:40.133922 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:12:51.729961 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:12:51.729926 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:13:38.630345 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:13:38.630308 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:13:46.231760 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:13:46.231725 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:13:48.861757 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:13:48.861729 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 15:13:48.864419 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:13:48.864398 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 15:13:55.630285 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:13:55.630247 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:14:03.233653 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:14:03.233616 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:14:13.631274 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:14:13.631242 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:14:21.133675 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:14:21.133636 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:14:30.727223 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:14:30.727184 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:14:39.528153 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:14:39.528119 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:14:48.532496 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:14:48.532457 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:14:56.133174 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:14:56.133139 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:15:00.141755 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.141707 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-8glnm"]
Apr 20 15:15:00.142297 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.142274 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1de48a6-1f4b-423c-876a-b2151c966d67" containerName="authorino"
Apr 20 15:15:00.142373 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.142301 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1de48a6-1f4b-423c-876a-b2151c966d67" containerName="authorino"
Apr 20 15:15:00.142452 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.142439 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1de48a6-1f4b-423c-876a-b2151c966d67" containerName="authorino"
Apr 20 15:15:00.145476 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.145455 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm"
Apr 20 15:15:00.147691 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.147667 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-t7lp5\""
Apr 20 15:15:00.159475 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.159447 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-8glnm"]
Apr 20 15:15:00.260012 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.259981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvh7q\" (UniqueName: \"kubernetes.io/projected/73ccfa59-dc40-4bb4-9b07-c1e5abe79b61-kube-api-access-dvh7q\") pod \"maas-api-key-cleanup-29611635-8glnm\" (UID: \"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61\") " pod="opendatahub/maas-api-key-cleanup-29611635-8glnm"
Apr 20 15:15:00.360535 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.360501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvh7q\" (UniqueName: \"kubernetes.io/projected/73ccfa59-dc40-4bb4-9b07-c1e5abe79b61-kube-api-access-dvh7q\") pod \"maas-api-key-cleanup-29611635-8glnm\" (UID: \"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61\") " pod="opendatahub/maas-api-key-cleanup-29611635-8glnm"
Apr 20 15:15:00.368694 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.368668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvh7q\" (UniqueName: \"kubernetes.io/projected/73ccfa59-dc40-4bb4-9b07-c1e5abe79b61-kube-api-access-dvh7q\") pod \"maas-api-key-cleanup-29611635-8glnm\" (UID: \"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61\") " pod="opendatahub/maas-api-key-cleanup-29611635-8glnm"
Apr 20 15:15:00.455579 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.455482 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm"
Apr 20 15:15:00.582465 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.582435 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-8glnm"]
Apr 20 15:15:00.583882 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:15:00.583858 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73ccfa59_dc40_4bb4_9b07_c1e5abe79b61.slice/crio-787c26c45d14cecd0e495eaee397d686d021d6f19b346038752ee50e869aa108 WatchSource:0}: Error finding container 787c26c45d14cecd0e495eaee397d686d021d6f19b346038752ee50e869aa108: Status 404 returned error can't find the container with id 787c26c45d14cecd0e495eaee397d686d021d6f19b346038752ee50e869aa108
Apr 20 15:15:00.585578 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.585560 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:15:00.773931 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:00.773839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" event={"ID":"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61","Type":"ContainerStarted","Data":"787c26c45d14cecd0e495eaee397d686d021d6f19b346038752ee50e869aa108"}
Apr 20 15:15:03.788069 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:03.788003 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" event={"ID":"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61","Type":"ContainerStarted","Data":"582d02f79733378b32aaa8e3460ad4698a3657b2b13fb678919bc48c10878c03"}
Apr 20 15:15:03.805289 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:03.805237 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" podStartSLOduration=1.944813721 podStartE2EDuration="3.805223429s" podCreationTimestamp="2026-04-20 15:15:00 +0000 UTC" firstStartedPulling="2026-04-20 15:15:00.585714049 +0000 UTC m=+1272.278257526" lastFinishedPulling="2026-04-20 15:15:02.446123757 +0000 UTC m=+1274.138667234" observedRunningTime="2026-04-20 15:15:03.803435932 +0000 UTC m=+1275.495979444" watchObservedRunningTime="2026-04-20 15:15:03.805223429 +0000 UTC m=+1275.497766928"
Apr 20 15:15:05.929462 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:05.929423 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:15:14.136446 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:14.136402 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:15:23.464456 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:23.464367 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:15:23.869813 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:23.869770 2575 generic.go:358] "Generic (PLEG): container finished" podID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerID="582d02f79733378b32aaa8e3460ad4698a3657b2b13fb678919bc48c10878c03" exitCode=6
Apr 20 15:15:23.870040 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:23.869847 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" event={"ID":"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61","Type":"ContainerDied","Data":"582d02f79733378b32aaa8e3460ad4698a3657b2b13fb678919bc48c10878c03"}
Apr 20 15:15:23.870213 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:23.870196 2575 scope.go:117] "RemoveContainer" containerID="582d02f79733378b32aaa8e3460ad4698a3657b2b13fb678919bc48c10878c03"
Apr 20 15:15:24.874330 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:24.874297 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" event={"ID":"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61","Type":"ContainerStarted","Data":"a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c"}
Apr 20 15:15:31.734749 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:31.734711 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:15:41.227377 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:41.227340 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:15:44.958580 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:44.958485 2575 generic.go:358] "Generic (PLEG): container finished" podID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerID="a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c" exitCode=6
Apr 20 15:15:44.958580 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:44.958560 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" event={"ID":"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61","Type":"ContainerDied","Data":"a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c"}
Apr 20 15:15:44.959059 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:44.958605 2575 scope.go:117] "RemoveContainer" containerID="582d02f79733378b32aaa8e3460ad4698a3657b2b13fb678919bc48c10878c03"
Apr 20 15:15:44.959059 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:44.958912 2575 scope.go:117] "RemoveContainer" containerID="a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c"
Apr 20 15:15:44.959200 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:15:44.959180 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611635-8glnm_opendatahub(73ccfa59-dc40-4bb4-9b07-c1e5abe79b61)\"" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61"
Apr 20 15:15:48.634979 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:48.634936 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:15:56.864493 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:56.864459 2575 scope.go:117] "RemoveContainer" containerID="a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c"
Apr 20 15:15:58.016222 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:58.016186 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" event={"ID":"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61","Type":"ContainerStarted","Data":"00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad"}
Apr 20 15:15:58.444763 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:58.444726 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:15:59.044405 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:59.044273 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-8glnm"]
Apr 20 15:15:59.044868 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:15:59.044668 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" containerID="cri-o://00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad" gracePeriod=30
Apr 20 15:16:06.935448 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:06.935365 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"]
Apr 20 15:16:17.794391 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:17.794365 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm"
Apr 20 15:16:17.858242 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:17.858163 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvh7q\" (UniqueName: \"kubernetes.io/projected/73ccfa59-dc40-4bb4-9b07-c1e5abe79b61-kube-api-access-dvh7q\") pod \"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61\" (UID: \"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61\") "
Apr 20 15:16:17.860373 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:17.860345 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ccfa59-dc40-4bb4-9b07-c1e5abe79b61-kube-api-access-dvh7q" (OuterVolumeSpecName: "kube-api-access-dvh7q") pod "73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" (UID: "73ccfa59-dc40-4bb4-9b07-c1e5abe79b61"). InnerVolumeSpecName "kube-api-access-dvh7q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:16:17.959186 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:17.959147 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dvh7q\" (UniqueName: \"kubernetes.io/projected/73ccfa59-dc40-4bb4-9b07-c1e5abe79b61-kube-api-access-dvh7q\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\""
Apr 20 15:16:18.096745 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.096699 2575 generic.go:358] "Generic (PLEG): container finished" podID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerID="00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad" exitCode=6
Apr 20 15:16:18.096899 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.096773 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm"
Apr 20 15:16:18.096899 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.096767 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" event={"ID":"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61","Type":"ContainerDied","Data":"00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad"}
Apr 20 15:16:18.096899 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.096888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611635-8glnm" event={"ID":"73ccfa59-dc40-4bb4-9b07-c1e5abe79b61","Type":"ContainerDied","Data":"787c26c45d14cecd0e495eaee397d686d021d6f19b346038752ee50e869aa108"}
Apr 20 15:16:18.097007 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.096910 2575 scope.go:117] "RemoveContainer" containerID="00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad"
Apr 20 15:16:18.105949 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.105931 2575 scope.go:117] "RemoveContainer" containerID="a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c"
Apr 20 15:16:18.113738 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.113722 2575 scope.go:117] "RemoveContainer" containerID="00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad"
Apr 20 15:16:18.113976 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:16:18.113955 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad\": container with ID starting with 00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad not found: ID does not exist" containerID="00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad"
Apr 20 15:16:18.114036 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.113986 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad"} err="failed to get container status \"00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad\": rpc error: code = NotFound desc = could not find container \"00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad\": container with ID starting with 00235e088faa5c61d4efb3c628f88e90f78ddcecdacc6a0d84d7c3860f826aad not found: ID does not exist"
Apr 20 15:16:18.114036 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.114003 2575 scope.go:117] "RemoveContainer" containerID="a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c"
Apr 20 15:16:18.114245 ip-10-0-140-93 kubenswrapper[2575]: E0420 15:16:18.114231 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c\": container with ID starting with a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c not found: ID does not exist" containerID="a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c"
Apr 20 15:16:18.114294 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.114248 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c"} err="failed to get container status \"a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c\": rpc error: code = NotFound desc = could not find container \"a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c\": container with ID starting with a12d1589f736033394fd22784414e4d35da299b5212d957c170b5b4d334ca94c not found: ID does not exist"
Apr 20 15:16:18.125198 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.125176 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-8glnm"]
Apr 20 15:16:18.129834 ip-10-0-140-93
kubenswrapper[2575]: I0420 15:16:18.129813 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611635-8glnm"] Apr 20 15:16:18.869775 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:18.869741 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" path="/var/lib/kubelet/pods/73ccfa59-dc40-4bb4-9b07-c1e5abe79b61/volumes" Apr 20 15:16:58.571485 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:58.571449 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"] Apr 20 15:16:58.571954 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:58.571676 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" podUID="c29776ad-873d-4da8-9ea6-fd3cde64ac51" containerName="manager" containerID="cri-o://465b6ae43d98eba94dd6229f6c0702c3260d9026513467af9131f2d645b796fb" gracePeriod=10 Apr 20 15:16:59.259095 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.259058 2575 generic.go:358] "Generic (PLEG): container finished" podID="c29776ad-873d-4da8-9ea6-fd3cde64ac51" containerID="465b6ae43d98eba94dd6229f6c0702c3260d9026513467af9131f2d645b796fb" exitCode=0 Apr 20 15:16:59.259290 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.259133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" event={"ID":"c29776ad-873d-4da8-9ea6-fd3cde64ac51","Type":"ContainerDied","Data":"465b6ae43d98eba94dd6229f6c0702c3260d9026513467af9131f2d645b796fb"} Apr 20 15:16:59.421887 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.421862 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" Apr 20 15:16:59.543714 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.543691 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hs4d\" (UniqueName: \"kubernetes.io/projected/c29776ad-873d-4da8-9ea6-fd3cde64ac51-kube-api-access-6hs4d\") pod \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\" (UID: \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\") " Apr 20 15:16:59.543865 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.543728 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c29776ad-873d-4da8-9ea6-fd3cde64ac51-extensions-socket-volume\") pod \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\" (UID: \"c29776ad-873d-4da8-9ea6-fd3cde64ac51\") " Apr 20 15:16:59.544169 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.544139 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c29776ad-873d-4da8-9ea6-fd3cde64ac51-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "c29776ad-873d-4da8-9ea6-fd3cde64ac51" (UID: "c29776ad-873d-4da8-9ea6-fd3cde64ac51"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:16:59.545866 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.545844 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c29776ad-873d-4da8-9ea6-fd3cde64ac51-kube-api-access-6hs4d" (OuterVolumeSpecName: "kube-api-access-6hs4d") pod "c29776ad-873d-4da8-9ea6-fd3cde64ac51" (UID: "c29776ad-873d-4da8-9ea6-fd3cde64ac51"). InnerVolumeSpecName "kube-api-access-6hs4d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:16:59.644702 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.644654 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6hs4d\" (UniqueName: \"kubernetes.io/projected/c29776ad-873d-4da8-9ea6-fd3cde64ac51-kube-api-access-6hs4d\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:16:59.644702 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:16:59.644698 2575 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c29776ad-873d-4da8-9ea6-fd3cde64ac51-extensions-socket-volume\") on node \"ip-10-0-140-93.ec2.internal\" DevicePath \"\"" Apr 20 15:17:00.263900 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:17:00.263861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" event={"ID":"c29776ad-873d-4da8-9ea6-fd3cde64ac51","Type":"ContainerDied","Data":"3c93b8e51cb11910df86b0a1f226188a0ad42f46bcf8f1f89d6f5600e230e9ac"} Apr 20 15:17:00.264107 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:17:00.263905 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp" Apr 20 15:17:00.264107 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:17:00.263917 2575 scope.go:117] "RemoveContainer" containerID="465b6ae43d98eba94dd6229f6c0702c3260d9026513467af9131f2d645b796fb" Apr 20 15:17:00.286443 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:17:00.286417 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"] Apr 20 15:17:00.292431 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:17:00.292401 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-zgpsp"] Apr 20 15:17:00.874685 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:17:00.874650 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c29776ad-873d-4da8-9ea6-fd3cde64ac51" path="/var/lib/kubelet/pods/c29776ad-873d-4da8-9ea6-fd3cde64ac51/volumes" Apr 20 15:18:04.744374 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744335 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26"] Apr 20 15:18:04.744778 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744742 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:18:04.744778 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744756 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:18:04.744778 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744766 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:18:04.744778 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744771 2575 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:18:04.744912 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744790 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c29776ad-873d-4da8-9ea6-fd3cde64ac51" containerName="manager" Apr 20 15:18:04.744912 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744795 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c29776ad-873d-4da8-9ea6-fd3cde64ac51" containerName="manager" Apr 20 15:18:04.744912 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744857 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:18:04.744912 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744868 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c29776ad-873d-4da8-9ea6-fd3cde64ac51" containerName="manager" Apr 20 15:18:04.744912 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.744875 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:18:04.748320 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.748304 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:04.751364 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.751345 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6qqhv\"" Apr 20 15:18:04.757987 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.757964 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26"] Apr 20 15:18:04.810867 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.810811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/714ec6d4-4387-45a3-a172-c70910e24b1b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-sjk26\" (UID: \"714ec6d4-4387-45a3-a172-c70910e24b1b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:04.811213 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.811183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgsjc\" (UniqueName: \"kubernetes.io/projected/714ec6d4-4387-45a3-a172-c70910e24b1b-kube-api-access-qgsjc\") pod \"kuadrant-operator-controller-manager-55c7f4c975-sjk26\" (UID: \"714ec6d4-4387-45a3-a172-c70910e24b1b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:04.912195 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.912167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgsjc\" (UniqueName: \"kubernetes.io/projected/714ec6d4-4387-45a3-a172-c70910e24b1b-kube-api-access-qgsjc\") pod \"kuadrant-operator-controller-manager-55c7f4c975-sjk26\" (UID: \"714ec6d4-4387-45a3-a172-c70910e24b1b\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:04.912340 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.912205 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/714ec6d4-4387-45a3-a172-c70910e24b1b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-sjk26\" (UID: \"714ec6d4-4387-45a3-a172-c70910e24b1b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:04.912543 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.912527 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/714ec6d4-4387-45a3-a172-c70910e24b1b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-sjk26\" (UID: \"714ec6d4-4387-45a3-a172-c70910e24b1b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:04.921099 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:04.921076 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgsjc\" (UniqueName: \"kubernetes.io/projected/714ec6d4-4387-45a3-a172-c70910e24b1b-kube-api-access-qgsjc\") pod \"kuadrant-operator-controller-manager-55c7f4c975-sjk26\" (UID: \"714ec6d4-4387-45a3-a172-c70910e24b1b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:05.060431 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:05.060385 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:05.191599 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:05.191576 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26"] Apr 20 15:18:05.192979 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:18:05.192948 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod714ec6d4_4387_45a3_a172_c70910e24b1b.slice/crio-b4f2338fc438b3a3ce79389865f7ed49ead0f3934b51ab555c8c344568a8f997 WatchSource:0}: Error finding container b4f2338fc438b3a3ce79389865f7ed49ead0f3934b51ab555c8c344568a8f997: Status 404 returned error can't find the container with id b4f2338fc438b3a3ce79389865f7ed49ead0f3934b51ab555c8c344568a8f997 Apr 20 15:18:05.513750 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:05.513671 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" event={"ID":"714ec6d4-4387-45a3-a172-c70910e24b1b","Type":"ContainerStarted","Data":"33e0f1c081bfc2244b90385d3424ad2c3d214c0166cba9cc3b34c8e06d5da024"} Apr 20 15:18:05.513750 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:05.513704 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" event={"ID":"714ec6d4-4387-45a3-a172-c70910e24b1b","Type":"ContainerStarted","Data":"b4f2338fc438b3a3ce79389865f7ed49ead0f3934b51ab555c8c344568a8f997"} Apr 20 15:18:05.513967 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:05.513790 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:05.534080 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:05.534007 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" podStartSLOduration=1.533995473 podStartE2EDuration="1.533995473s" podCreationTimestamp="2026-04-20 15:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:18:05.530582461 +0000 UTC m=+1457.223125960" watchObservedRunningTime="2026-04-20 15:18:05.533995473 +0000 UTC m=+1457.226538971" Apr 20 15:18:16.518954 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:16.518921 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-sjk26" Apr 20 15:18:22.933902 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:22.933868 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:18:30.629949 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:30.629910 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:18:48.900305 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:48.900279 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log" Apr 20 15:18:48.903109 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:48.903090 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log" Apr 20 15:18:55.031747 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:55.031702 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:18:59.428117 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:18:59.428073 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 
15:19:10.336559 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:19:10.336521 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:19:20.334259 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:19:20.334224 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:19:28.428696 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:19:28.428657 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:19:39.230644 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:19:39.230605 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:19:48.437313 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:19:48.437280 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:19:59.033277 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:19:59.033242 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:20:08.332181 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:20:08.332146 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:20:18.528496 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:20:18.528462 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:20:27.831069 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:20:27.831035 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:21:02.543483 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:21:02.543445 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:21:46.125892 ip-10-0-140-93 
kubenswrapper[2575]: I0420 15:21:46.125856 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:21:56.130686 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:21:56.130650 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:22:05.532377 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:22:05.532290 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:22:14.032585 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:22:14.032550 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:22:21.438080 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:22:21.438036 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:22:32.939467 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:22:32.939432 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:22:42.032775 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:22:42.032740 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:22:50.436091 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:22:50.436049 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:22:58.132583 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:22:58.132543 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:23:06.144062 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:23:06.144009 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:23:15.432657 ip-10-0-140-93 kubenswrapper[2575]: I0420 
15:23:15.432623 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:23:25.236861 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:23:25.236826 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:23:43.230359 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:23:43.230326 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:23:48.936930 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:23:48.936905 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log" Apr 20 15:23:48.940633 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:23:48.940611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log" Apr 20 15:23:51.737718 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:23:51.737676 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:24:00.841058 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:24:00.840998 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:24:08.748470 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:24:08.748434 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:24:25.935225 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:24:25.935186 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:24:33.943395 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:24:33.943358 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:24:43.028720 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:24:43.028684 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:24:51.132990 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:24:51.132958 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:25:00.637476 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:25:00.637439 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:25:09.037330 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:25:09.037235 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:25:18.532906 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:25:18.532871 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:25:31.744823 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:25:31.744791 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:25:40.832598 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:25:40.832557 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:25:54.133514 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:25:54.133478 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:26:03.145510 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:26:03.145474 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:26:11.429773 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:26:11.429725 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:26:21.133702 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:26:21.133654 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:26:26.832379 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:26:26.832346 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:26:45.031421 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:26:45.031389 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:26:53.337599 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:26:53.337563 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:27:02.528619 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:02.528586 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:27:10.635997 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:10.635948 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:27:34.529945 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:34.529905 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:27:46.932861 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:46.932827 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-clp8b"] Apr 20 15:27:48.718695 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:48.718662 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-d49cd9b47-428ld_b3053491-a607-482c-8f52-67f92d2191ef/authorino/0.log" Apr 20 15:27:53.511971 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:53.511923 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-854569cf8c-g6m22_7f6d266c-54c0-444c-bc4f-a73bfe2997b7/manager/0.log" Apr 20 15:27:54.840682 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:54.840648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc_f775f5ff-4d61-4291-9a07-f05295333617/util/0.log" Apr 20 15:27:54.847503 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:54.847481 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc_f775f5ff-4d61-4291-9a07-f05295333617/pull/0.log" Apr 20 15:27:54.853874 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:54.853846 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc_f775f5ff-4d61-4291-9a07-f05295333617/extract/0.log" Apr 20 15:27:54.972618 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:54.972589 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x_35bbc67c-5dc1-440c-a1af-8f36a06c132d/extract/0.log" Apr 20 15:27:54.978950 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:54.978927 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x_35bbc67c-5dc1-440c-a1af-8f36a06c132d/util/0.log" Apr 20 15:27:54.985412 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:54.985394 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x_35bbc67c-5dc1-440c-a1af-8f36a06c132d/pull/0.log" Apr 20 15:27:55.099704 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.099628 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67_6c2f3f9c-039c-4593-bb74-577b0e5c7a18/extract/0.log" Apr 20 15:27:55.105945 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.105925 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67_6c2f3f9c-039c-4593-bb74-577b0e5c7a18/util/0.log" Apr 20 15:27:55.112048 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.112011 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67_6c2f3f9c-039c-4593-bb74-577b0e5c7a18/pull/0.log" Apr 20 15:27:55.228756 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.228732 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k_cf8d3902-d7ed-48e5-b3ea-1cb78d370f74/util/0.log" Apr 20 15:27:55.235015 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.234996 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k_cf8d3902-d7ed-48e5-b3ea-1cb78d370f74/pull/0.log" Apr 20 15:27:55.241552 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.241534 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k_cf8d3902-d7ed-48e5-b3ea-1cb78d370f74/extract/0.log" Apr 20 15:27:55.375669 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.375586 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-d49cd9b47-428ld_b3053491-a607-482c-8f52-67f92d2191ef/authorino/0.log" Apr 20 15:27:55.643058 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.642965 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-2hcj5_f9be6cbd-aafd-4893-94fe-67163f1587c3/manager/0.log" Apr 20 15:27:55.766514 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.766485 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-qmdj7_97f7f61e-7044-4647-b569-38e5b28ec72c/kuadrant-console-plugin/0.log" Apr 20 15:27:55.889824 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:55.889778 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-2ktsp_6ac24cef-df75-4a36-874e-f7a8348dc133/registry-server/0.log" Apr 20 15:27:56.011934 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:56.011825 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-sjk26_714ec6d4-4387-45a3-a172-c70910e24b1b/manager/0.log" Apr 20 15:27:56.136183 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:56.136153 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-clp8b_f4325a55-6660-47f1-b8a8-be49d04f9f2e/limitador/0.log" Apr 20 15:27:56.640989 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:56.640953 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fxphz7_f61a3200-3989-4fa9-9836-2a809a33573e/istio-proxy/0.log" Apr 20 15:27:57.132642 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:57.132610 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-zwxkw_8101843f-59d4-44da-8496-cc21d3abb48b/istio-proxy/0.log" Apr 20 15:27:57.271876 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:27:57.271853 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b6cfdf44-qtvqj_ec2d9386-fc11-40ea-b4da-f2e1aa8d435e/router/0.log" Apr 20 
15:28:02.612220 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.612128 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrbql/must-gather-szwkl"] Apr 20 15:28:02.612669 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.612511 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:28:02.612669 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.612523 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:28:02.612669 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.612596 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="73ccfa59-dc40-4bb4-9b07-c1e5abe79b61" containerName="cleanup" Apr 20 15:28:02.615796 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.615775 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrbql/must-gather-szwkl" Apr 20 15:28:02.618209 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.618189 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrbql\"/\"openshift-service-ca.crt\"" Apr 20 15:28:02.619168 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.619153 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrbql\"/\"default-dockercfg-whr8p\"" Apr 20 15:28:02.619256 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.619184 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrbql\"/\"kube-root-ca.crt\"" Apr 20 15:28:02.631069 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.631047 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrbql/must-gather-szwkl"] Apr 20 15:28:02.710688 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.710652 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff690356-b53b-45f8-bd1b-3f0d19d942cf-must-gather-output\") pod \"must-gather-szwkl\" (UID: \"ff690356-b53b-45f8-bd1b-3f0d19d942cf\") " pod="openshift-must-gather-jrbql/must-gather-szwkl" Apr 20 15:28:02.710844 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.710737 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4t96\" (UniqueName: \"kubernetes.io/projected/ff690356-b53b-45f8-bd1b-3f0d19d942cf-kube-api-access-v4t96\") pod \"must-gather-szwkl\" (UID: \"ff690356-b53b-45f8-bd1b-3f0d19d942cf\") " pod="openshift-must-gather-jrbql/must-gather-szwkl" Apr 20 15:28:02.811647 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.811609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4t96\" (UniqueName: \"kubernetes.io/projected/ff690356-b53b-45f8-bd1b-3f0d19d942cf-kube-api-access-v4t96\") pod \"must-gather-szwkl\" (UID: \"ff690356-b53b-45f8-bd1b-3f0d19d942cf\") " pod="openshift-must-gather-jrbql/must-gather-szwkl" Apr 20 15:28:02.811820 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.811699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff690356-b53b-45f8-bd1b-3f0d19d942cf-must-gather-output\") pod \"must-gather-szwkl\" (UID: \"ff690356-b53b-45f8-bd1b-3f0d19d942cf\") " pod="openshift-must-gather-jrbql/must-gather-szwkl" Apr 20 15:28:02.812184 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.812163 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ff690356-b53b-45f8-bd1b-3f0d19d942cf-must-gather-output\") pod \"must-gather-szwkl\" (UID: \"ff690356-b53b-45f8-bd1b-3f0d19d942cf\") " pod="openshift-must-gather-jrbql/must-gather-szwkl" Apr 20 
15:28:02.820012 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.819993 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4t96\" (UniqueName: \"kubernetes.io/projected/ff690356-b53b-45f8-bd1b-3f0d19d942cf-kube-api-access-v4t96\") pod \"must-gather-szwkl\" (UID: \"ff690356-b53b-45f8-bd1b-3f0d19d942cf\") " pod="openshift-must-gather-jrbql/must-gather-szwkl" Apr 20 15:28:02.925811 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:02.925740 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrbql/must-gather-szwkl" Apr 20 15:28:03.055393 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:03.055367 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrbql/must-gather-szwkl"] Apr 20 15:28:03.056677 ip-10-0-140-93 kubenswrapper[2575]: W0420 15:28:03.056651 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff690356_b53b_45f8_bd1b_3f0d19d942cf.slice/crio-038c73267bbcfce133e7b235dc9f531e2d032134b87b7f009556e4baa6e8b56e WatchSource:0}: Error finding container 038c73267bbcfce133e7b235dc9f531e2d032134b87b7f009556e4baa6e8b56e: Status 404 returned error can't find the container with id 038c73267bbcfce133e7b235dc9f531e2d032134b87b7f009556e4baa6e8b56e Apr 20 15:28:03.058430 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:03.058412 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:28:03.894465 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:03.894406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrbql/must-gather-szwkl" event={"ID":"ff690356-b53b-45f8-bd1b-3f0d19d942cf","Type":"ContainerStarted","Data":"038c73267bbcfce133e7b235dc9f531e2d032134b87b7f009556e4baa6e8b56e"} Apr 20 15:28:04.901561 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:04.901521 2575 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-jrbql/must-gather-szwkl" event={"ID":"ff690356-b53b-45f8-bd1b-3f0d19d942cf","Type":"ContainerStarted","Data":"661569a8b97ee1e807f7c9932a96736b8c42d1ef3fc9eba7aae9b2df72f178bd"} Apr 20 15:28:04.901952 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:04.901569 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrbql/must-gather-szwkl" event={"ID":"ff690356-b53b-45f8-bd1b-3f0d19d942cf","Type":"ContainerStarted","Data":"f54cf538f570e6044a4518bb505874704e8282b9ab052c57e0f026667b124130"} Apr 20 15:28:04.919492 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:04.919428 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrbql/must-gather-szwkl" podStartSLOduration=2.135736228 podStartE2EDuration="2.919393766s" podCreationTimestamp="2026-04-20 15:28:02 +0000 UTC" firstStartedPulling="2026-04-20 15:28:03.058556769 +0000 UTC m=+2054.751100247" lastFinishedPulling="2026-04-20 15:28:03.842214292 +0000 UTC m=+2055.534757785" observedRunningTime="2026-04-20 15:28:04.916562103 +0000 UTC m=+2056.609105604" watchObservedRunningTime="2026-04-20 15:28:04.919393766 +0000 UTC m=+2056.611937267" Apr 20 15:28:05.470626 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:05.470576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-r72kh_419c2e41-ed48-42a7-81ae-10358a918874/global-pull-secret-syncer/0.log" Apr 20 15:28:05.600736 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:05.600704 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vc6rt_842695c0-6492-4f06-9bbd-385652fc1969/konnectivity-agent/0.log" Apr 20 15:28:05.671916 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:05.671888 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-93.ec2.internal_1b1b2e50dbc39040c838976f73463d02/haproxy/0.log" Apr 20 15:28:09.050197 
ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.050153 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc_f775f5ff-4d61-4291-9a07-f05295333617/extract/0.log" Apr 20 15:28:09.072602 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.072566 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc_f775f5ff-4d61-4291-9a07-f05295333617/util/0.log" Apr 20 15:28:09.105394 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.105323 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759xthgc_f775f5ff-4d61-4291-9a07-f05295333617/pull/0.log" Apr 20 15:28:09.162386 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.162299 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x_35bbc67c-5dc1-440c-a1af-8f36a06c132d/extract/0.log" Apr 20 15:28:09.190472 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.190439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x_35bbc67c-5dc1-440c-a1af-8f36a06c132d/util/0.log" Apr 20 15:28:09.225488 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.225458 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e05w27x_35bbc67c-5dc1-440c-a1af-8f36a06c132d/pull/0.log" Apr 20 15:28:09.269645 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.269553 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67_6c2f3f9c-039c-4593-bb74-577b0e5c7a18/extract/0.log" Apr 20 15:28:09.317104 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.316990 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67_6c2f3f9c-039c-4593-bb74-577b0e5c7a18/util/0.log" Apr 20 15:28:09.346921 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.346895 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73chb67_6c2f3f9c-039c-4593-bb74-577b0e5c7a18/pull/0.log" Apr 20 15:28:09.384561 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.384535 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k_cf8d3902-d7ed-48e5-b3ea-1cb78d370f74/extract/0.log" Apr 20 15:28:09.413483 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.413457 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k_cf8d3902-d7ed-48e5-b3ea-1cb78d370f74/util/0.log" Apr 20 15:28:09.444606 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.444557 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1n9q8k_cf8d3902-d7ed-48e5-b3ea-1cb78d370f74/pull/0.log" Apr 20 15:28:09.713522 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.713404 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-d49cd9b47-428ld_b3053491-a607-482c-8f52-67f92d2191ef/authorino/0.log" Apr 20 15:28:09.806657 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.806622 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-2hcj5_f9be6cbd-aafd-4893-94fe-67163f1587c3/manager/0.log" Apr 20 15:28:09.836046 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.835982 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-qmdj7_97f7f61e-7044-4647-b569-38e5b28ec72c/kuadrant-console-plugin/0.log" Apr 20 15:28:09.876581 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.876548 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-2ktsp_6ac24cef-df75-4a36-874e-f7a8348dc133/registry-server/0.log" Apr 20 15:28:09.951481 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.951450 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-sjk26_714ec6d4-4387-45a3-a172-c70910e24b1b/manager/0.log" Apr 20 15:28:09.997524 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:09.997442 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-clp8b_f4325a55-6660-47f1-b8a8-be49d04f9f2e/limitador/0.log" Apr 20 15:28:11.391100 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.391072 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_22424d77-53b6-4c1b-9c41-b2f13a205d99/alertmanager/0.log" Apr 20 15:28:11.414742 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.414713 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_22424d77-53b6-4c1b-9c41-b2f13a205d99/config-reloader/0.log" Apr 20 15:28:11.437647 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.437619 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_22424d77-53b6-4c1b-9c41-b2f13a205d99/kube-rbac-proxy-web/0.log" Apr 20 15:28:11.469827 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.469802 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_22424d77-53b6-4c1b-9c41-b2f13a205d99/kube-rbac-proxy/0.log" Apr 20 15:28:11.498162 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.498132 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_22424d77-53b6-4c1b-9c41-b2f13a205d99/kube-rbac-proxy-metric/0.log" Apr 20 15:28:11.554121 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.554078 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_22424d77-53b6-4c1b-9c41-b2f13a205d99/prom-label-proxy/0.log" Apr 20 15:28:11.612885 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.612843 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_22424d77-53b6-4c1b-9c41-b2f13a205d99/init-config-reloader/0.log" Apr 20 15:28:11.662705 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.662628 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-bp8pr_abfd5746-0891-4e15-9237-6631a29b8009/cluster-monitoring-operator/0.log" Apr 20 15:28:11.813303 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.813274 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-wdcrf_3f215683-8eed-4b7b-9022-2e52ec9b239d/monitoring-plugin/0.log" Apr 20 15:28:11.858046 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.857985 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bmrzc_b9528386-7ec4-424d-aa07-1eca20561056/node-exporter/0.log" Apr 20 15:28:11.896164 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.896132 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bmrzc_b9528386-7ec4-424d-aa07-1eca20561056/kube-rbac-proxy/0.log" Apr 20 15:28:11.929417 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:11.929347 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bmrzc_b9528386-7ec4-424d-aa07-1eca20561056/init-textfile/0.log" Apr 20 15:28:12.645852 ip-10-0-140-93 kubenswrapper[2575]: I0420 
15:28:12.645810 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f5ccfc466-mqcfd_bfbf3447-579f-420e-b165-be48ea35efad/thanos-query/0.log" Apr 20 15:28:12.667040 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:12.666994 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f5ccfc466-mqcfd_bfbf3447-579f-420e-b165-be48ea35efad/kube-rbac-proxy-web/0.log" Apr 20 15:28:12.691679 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:12.691584 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f5ccfc466-mqcfd_bfbf3447-579f-420e-b165-be48ea35efad/kube-rbac-proxy/0.log" Apr 20 15:28:12.715097 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:12.715069 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f5ccfc466-mqcfd_bfbf3447-579f-420e-b165-be48ea35efad/prom-label-proxy/0.log" Apr 20 15:28:12.747215 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:12.747183 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f5ccfc466-mqcfd_bfbf3447-579f-420e-b165-be48ea35efad/kube-rbac-proxy-rules/0.log" Apr 20 15:28:12.767756 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:12.767727 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5f5ccfc466-mqcfd_bfbf3447-579f-420e-b165-be48ea35efad/kube-rbac-proxy-metrics/0.log" Apr 20 15:28:14.600266 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.600231 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm"] Apr 20 15:28:14.606115 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.606084 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.612128 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.612099 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm"] Apr 20 15:28:14.743760 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.743722 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-proc\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.743947 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.743775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-podres\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.743947 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.743870 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-sys\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.743947 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.743940 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-lib-modules\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " 
pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.744181 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.744002 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtch6\" (UniqueName: \"kubernetes.io/projected/3705357a-5064-4f50-9337-90be1a4adfda-kube-api-access-xtch6\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.840006 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.839977 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-d74bcc6f7-mfhfl_06626336-d19f-4aae-9c9d-1d525e4876fd/console/0.log" Apr 20 15:28:14.844545 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844512 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-proc\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.844721 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-podres\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.844721 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844568 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-sys\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " 
pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.844721 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-sys\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.844721 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844648 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-proc\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.844721 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844683 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-podres\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.844721 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844694 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-lib-modules\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.844977 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844759 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtch6\" (UniqueName: \"kubernetes.io/projected/3705357a-5064-4f50-9337-90be1a4adfda-kube-api-access-xtch6\") 
pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.844977 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.844857 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3705357a-5064-4f50-9337-90be1a4adfda-lib-modules\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.852850 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.852792 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtch6\" (UniqueName: \"kubernetes.io/projected/3705357a-5064-4f50-9337-90be1a4adfda-kube-api-access-xtch6\") pod \"perf-node-gather-daemonset-9qpcm\" (UID: \"3705357a-5064-4f50-9337-90be1a4adfda\") " pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:14.869574 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.869534 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-v5stc_4ccad5fe-a610-4440-ad4f-3cba3e926719/download-server/0.log" Apr 20 15:28:14.921122 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:14.921079 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:15.312276 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:15.311449 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm"] Apr 20 15:28:15.423009 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:15.422986 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-82wgd_10750e47-7592-4544-9e16-62ee13fcf036/volume-data-source-validator/0.log" Apr 20 15:28:15.965527 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:15.965485 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" event={"ID":"3705357a-5064-4f50-9337-90be1a4adfda","Type":"ContainerStarted","Data":"b1c10058d286a7ef171da1bba5ad96be1f28a7720add30316f44532135f19679"} Apr 20 15:28:15.965994 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:15.965532 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" event={"ID":"3705357a-5064-4f50-9337-90be1a4adfda","Type":"ContainerStarted","Data":"c02b9c344aa9f5e07343d496417701a16e9092802e2d5488beb444f423d45c9f"} Apr 20 15:28:15.965994 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:15.965648 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" Apr 20 15:28:15.983551 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:15.983494 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm" podStartSLOduration=1.983478448 podStartE2EDuration="1.983478448s" podCreationTimestamp="2026-04-20 15:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-20 15:28:15.98060632 +0000 UTC m=+2067.673149820" watchObservedRunningTime="2026-04-20 15:28:15.983478448 +0000 UTC m=+2067.676021947"
Apr 20 15:28:16.185077 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:16.185054 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8swcv_e19d0653-6009-45aa-a269-a68af8375182/dns/0.log"
Apr 20 15:28:16.205655 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:16.205626 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8swcv_e19d0653-6009-45aa-a269-a68af8375182/kube-rbac-proxy/0.log"
Apr 20 15:28:16.325913 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:16.325889 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2z95t_3004f37c-e216-4320-82a5-11c7c7fe8be1/dns-node-resolver/0.log"
Apr 20 15:28:16.825764 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:16.825734 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7bdb7cb467-cqn7c_2fc25d12-330f-41bf-82d8-7158f40715bf/registry/0.log"
Apr 20 15:28:16.869603 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:16.869577 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nwk2q_e674f45e-6036-47da-a806-c40040927fba/node-ca/0.log"
Apr 20 15:28:17.717958 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:17.717928 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fxphz7_f61a3200-3989-4fa9-9836-2a809a33573e/istio-proxy/0.log"
Apr 20 15:28:17.984620 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:17.984537 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-zwxkw_8101843f-59d4-44da-8496-cc21d3abb48b/istio-proxy/0.log"
Apr 20 15:28:18.004885 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:18.004863 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7b6cfdf44-qtvqj_ec2d9386-fc11-40ea-b4da-f2e1aa8d435e/router/0.log"
Apr 20 15:28:18.579081 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:18.579053 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-w22hl_0e64e671-ff76-45fc-b205-a75b74329230/serve-healthcheck-canary/0.log"
Apr 20 15:28:19.001305 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:19.001221 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-7rn4l_34f2a619-5bb8-4702-ae7e-217e448429bc/insights-operator/0.log"
Apr 20 15:28:19.002600 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:19.002576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-7rn4l_34f2a619-5bb8-4702-ae7e-217e448429bc/insights-operator/1.log"
Apr 20 15:28:19.129810 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:19.129780 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s2sjx_433b9f99-bb71-468d-9a18-83234a426f09/kube-rbac-proxy/0.log"
Apr 20 15:28:19.151773 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:19.151739 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s2sjx_433b9f99-bb71-468d-9a18-83234a426f09/exporter/0.log"
Apr 20 15:28:19.173678 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:19.173651 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s2sjx_433b9f99-bb71-468d-9a18-83234a426f09/extractor/0.log"
Apr 20 15:28:21.358162 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:21.358128 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-854569cf8c-g6m22_7f6d266c-54c0-444c-bc4f-a73bfe2997b7/manager/0.log"
Apr 20 15:28:21.983733 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:21.983703 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrbql/perf-node-gather-daemonset-9qpcm"
Apr 20 15:28:22.618041 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:22.618000 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-54f6c466b9-w6shx_22b1eef0-9570-4084-861a-bee77a3ad8ef/manager/0.log"
Apr 20 15:28:27.117803 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:27.117775 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-f49kv_83a12d4d-86d3-4c54-a107-3a8275bd04db/migrator/0.log"
Apr 20 15:28:27.137803 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:27.137774 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-f49kv_83a12d4d-86d3-4c54-a107-3a8275bd04db/graceful-termination/0.log"
Apr 20 15:28:28.749037 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:28.748998 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9ckk_4735b84c-6b45-45bb-8802-627a40d45e62/kube-multus-additional-cni-plugins/0.log"
Apr 20 15:28:28.773072 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:28.773045 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9ckk_4735b84c-6b45-45bb-8802-627a40d45e62/egress-router-binary-copy/0.log"
Apr 20 15:28:28.794462 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:28.794439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9ckk_4735b84c-6b45-45bb-8802-627a40d45e62/cni-plugins/0.log"
Apr 20 15:28:28.814910 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:28.814886 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9ckk_4735b84c-6b45-45bb-8802-627a40d45e62/bond-cni-plugin/0.log"
Apr 20 15:28:28.834590 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:28.834562 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9ckk_4735b84c-6b45-45bb-8802-627a40d45e62/routeoverride-cni/0.log"
Apr 20 15:28:28.853809 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:28.853784 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9ckk_4735b84c-6b45-45bb-8802-627a40d45e62/whereabouts-cni-bincopy/0.log"
Apr 20 15:28:28.874639 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:28.874619 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p9ckk_4735b84c-6b45-45bb-8802-627a40d45e62/whereabouts-cni/0.log"
Apr 20 15:28:29.075595 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:29.075545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bcj2f_37f9fe2f-e8a7-4419-9f1d-a937dfadf1c1/kube-multus/0.log"
Apr 20 15:28:29.184742 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:29.184706 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gpcl9_ae2439f5-03aa-43b8-9466-c01fbcb53912/network-metrics-daemon/0.log"
Apr 20 15:28:29.203635 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:29.203611 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gpcl9_ae2439f5-03aa-43b8-9466-c01fbcb53912/kube-rbac-proxy/0.log"
Apr 20 15:28:30.048702 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.048672 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-controller/0.log"
Apr 20 15:28:30.064486 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.064457 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/0.log"
Apr 20 15:28:30.076759 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.076731 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovn-acl-logging/1.log"
Apr 20 15:28:30.097068 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.097042 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/kube-rbac-proxy-node/0.log"
Apr 20 15:28:30.118859 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.118833 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 15:28:30.137009 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.136975 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/northd/0.log"
Apr 20 15:28:30.156230 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.156205 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/nbdb/0.log"
Apr 20 15:28:30.180644 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.180617 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/sbdb/0.log"
Apr 20 15:28:30.298852 ip-10-0-140-93 kubenswrapper[2575]: I0420 15:28:30.298779 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4d2sl_a1a5ae57-d482-4ab1-94d0-99811e6761ea/ovnkube-controller/0.log"