Apr 17 17:26:13.300483 ip-10-0-137-109 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:26:13.770762 ip-10-0-137-109 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:26:13.770762 ip-10-0-137-109 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:26:13.770762 ip-10-0-137-109 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:26:13.770762 ip-10-0-137-109 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:26:13.770762 ip-10-0-137-109 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:26:13.772551 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.772464 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:26:13.775804 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775777 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:26:13.775804 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775800 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:26:13.775804 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775803 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:26:13.775804 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775807 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:26:13.775804 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775810 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:26:13.775804 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775813 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:26:13.775804 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775816 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775820 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775822 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775825 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775828 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775831 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775833 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775836 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775838 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775841 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775844 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775848 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775851 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775854 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775857 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775865 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775868 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775871 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775873 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775876 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:26:13.776095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775878 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775881 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775884 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775886 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775889 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775891 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775894 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775897 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775900 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775902 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775905 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775907 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775910 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775912 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775915 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775918 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775921 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775924 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775927 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775931 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:26:13.776571 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775934 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775937 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775940 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775943 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775945 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775948 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775950 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775953 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775956 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775958 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775961 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775963 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775966 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775968 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775971 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775974 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775977 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775981 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775986 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:26:13.777098 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775989 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775992 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775995 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.775998 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776001 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776003 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776006 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776009 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776012 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776015 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776017 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776035 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776038 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776045 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776047 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776050 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776053 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776055 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776058 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776061 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:26:13.777556 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776063 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776459 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776464 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776467 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776470 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776473 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776476 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776478 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776481 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776484 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776486 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776489 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776492 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776494 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776497 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776502 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776506 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776509 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776512 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:26:13.778092 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776516 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776519 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776523 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776525 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776528 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776530 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776534 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776537 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776539 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776541 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776544 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776546 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776550 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776553 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776556 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776560 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776563 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776566 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776568 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:26:13.778552 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776571 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776573 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776576 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776579 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776581 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776584 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776587 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776590 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776593 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776595 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776598 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776600 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776603 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776606 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776608 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776611 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776613 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776616 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776618 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776621 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:26:13.779181 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776624 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776626 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776629 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776632 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776635 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776637 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776640 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776642 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776646 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776648 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776651 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776654 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776657 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776659 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776662 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776666 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776668 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776671 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776673 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776675 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:26:13.779680 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776678 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776680 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776683 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776685 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776688 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776690 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776693 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776696 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.776698 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778117 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778129 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778136 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778141 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778147 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778151 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778155 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778160 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778164 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778167 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778170 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778174 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778177 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:26:13.780193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778181 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778184 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778187 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778190 2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778193 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778196 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778200 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778204 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778207 2576 flags.go:64] FLAG: --config-dir=""
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778210 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778214 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778218 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778221 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778224 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778228 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778231 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778234 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778237 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778241 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778243 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778248 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778250 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778253 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778256 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778261 2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:26:13.780764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778264 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778270 2576 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778273 2576 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778277 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778280 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778284 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778288 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778291 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778294 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778297 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778300 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778303 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778306 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778309 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778312 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778315 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778318 2576 flags.go:64] FLAG: --feature-gates=""
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778322 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778325 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778328 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778331 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778334 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778337 2576 flags.go:64] FLAG: --help="false"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778339 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-109.ec2.internal"
Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778343 2576 flags.go:64] FLAG:
--housekeeping-interval="10s" Apr 17 17:26:13.781391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778346 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778349 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778352 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778356 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778359 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778361 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778365 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778368 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778371 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778375 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778377 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778380 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778383 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:26:13.782001 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:26:13.778386 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778389 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778392 2576 flags.go:64] FLAG: --lock-file="" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778395 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778398 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778401 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778407 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778410 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778412 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778416 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 17:26:13.782001 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778418 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778422 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778424 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778427 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778431 2576 flags.go:64] FLAG: 
--max-housekeeping-interval="15s" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778434 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778438 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778441 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778444 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778447 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778450 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778453 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778456 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778458 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778467 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778470 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778473 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778477 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778480 2576 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778486 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778488 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778492 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778495 2576 flags.go:64] FLAG: --port="10250" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778498 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:26:13.782559 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778501 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00a640b49433bde1c" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778505 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778508 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778511 2576 flags.go:64] FLAG: --register-node="true" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778513 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778516 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778520 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778523 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778526 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 
17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778529 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778533 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778536 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778539 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778542 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778544 2576 flags.go:64] FLAG: --runonce="false" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778548 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778551 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778554 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778557 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778560 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778563 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778566 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778569 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 
17:26:13.778572 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778575 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778578 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:26:13.783162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778581 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778585 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778588 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778591 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778597 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778600 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778603 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778607 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778610 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778613 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778616 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778620 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 
17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778623 2576 flags.go:64] FLAG: --v="2" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778627 2576 flags.go:64] FLAG: --version="false" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778631 2576 flags.go:64] FLAG: --vmodule="" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778636 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778640 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778735 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778739 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778742 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778744 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778747 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778750 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:26:13.783793 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778753 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778759 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:26:13.784378 ip-10-0-137-109 
kubenswrapper[2576]: W0417 17:26:13.778762 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778764 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778767 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778770 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778773 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778775 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778778 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778782 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778786 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778789 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778793 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778796 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778799 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778801 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778804 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778807 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778809 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:26:13.784378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778812 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778815 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778817 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778822 
2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778825 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778828 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778831 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778834 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778836 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778839 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778842 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778845 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778848 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778850 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778855 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778857 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 
17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778860 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778863 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778865 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778867 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:26:13.784915 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778870 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778873 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778875 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778878 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778881 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778884 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778888 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778891 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778893 2576 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778896 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778898 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778901 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778903 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778906 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778908 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778911 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778913 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778916 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778918 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778921 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:26:13.785417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778923 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:26:13.785920 
ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778926 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778929 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778932 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778934 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778937 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778941 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778943 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778946 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778948 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778951 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778953 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778956 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778959 
2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778961 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778964 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778966 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778970 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778973 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778976 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:26:13.785920 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.778978 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.778987 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.785767 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.785785 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785850 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785857 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785860 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785864 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785866 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785869 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785872 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785875 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785878 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785881 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785884 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785886 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:26:13.786486 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785889 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785892 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785894 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785897 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785900 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785902 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785905 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785908 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785910 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785913 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785916 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785918 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785921 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785924 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785927 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785930 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785933 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785936 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785939 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785942 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:26:13.786895 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785945 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785948 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785952 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785956 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785959 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785962 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785965 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785969 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785973 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785976 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785979 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785981 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.785984 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786000 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786003 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786007 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786010 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786012 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786015 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:26:13.787446 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786018 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786037 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786041 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786044 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786046 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786051 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786053 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786057 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786060 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786063 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786065 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786068 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786072 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786074 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786077 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786080 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786083 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786086 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786089 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786092 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:26:13.787948 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786095 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786097 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786100 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786102 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786105 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786108 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786110 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786113 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786116 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786118 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786121 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786124 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786126 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786129 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786131 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:26:13.788588 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.786136 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786237 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786242 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786246 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786249 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786253 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786255 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786258 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786261 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786264 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786266 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786269 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786271 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786274 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786276 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786279 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786282 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786284 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786287 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786289 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:26:13.788981 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786292 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786294 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786297 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786299 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786302 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786304 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786307 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786310 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786312 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786315 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786317 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786320 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786324 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786327 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786330 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786333 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786336 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786339 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786342 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:26:13.789526 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786345 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786348 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786351 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786353 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786356 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786358 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786361 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786363 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786366 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786368 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786371 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786374 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786376 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786379 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786381 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786384 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786386 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786389 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786391 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:26:13.790037 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786394 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786396 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786399 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786402 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786404 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786407 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786409 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786412 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786414 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786417 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786419 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786422 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786424 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786431 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786434 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786436 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786438 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786441 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786444 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786446 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:26:13.790515 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786448 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786451 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786453 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786456 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786458 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786460 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786463 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786465 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:13.786469 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.786474 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.787233 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:26:13.791017 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.790083 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:26:13.791337 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.791017 2576 server.go:1019] "Starting client certificate rotation"
Apr 17 17:26:13.791337 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.791126 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:26:13.791337 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.791164 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:26:13.815504 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.815479 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:26:13.821168 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.821124 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:26:13.836955 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.836930 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:26:13.842778 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.842759 2576 log.go:25] "Validated CRI v1 image API"
Apr 17 17:26:13.843983 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.843969 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:26:13.847366 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.847338 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 808dbc6a-12a3-486e-8c87-c2d8f714e51c:/dev/nvme0n1p3 ae85e425-65ef-4eb3-a075-30e70f34c378:/dev/nvme0n1p4]
Apr 17 17:26:13.847455 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.847364 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:26:13.848751 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.848733 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:26:13.852570 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.852443 2576 manager.go:217] Machine: {Timestamp:2026-04-17 17:26:13.851222755 +0000 UTC m=+0.426571035 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3108205 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec210533b75e4832d5059ff55a5ff10b SystemUUID:ec210533-b75e-4832-d505-9ff55a5ff10b BootID:218ce415-e3d9-4386-afe0-35426ed3b316 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:32:9b:c0:25:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:32:9b:c0:25:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0a:32:bf:b9:45:9f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:26:13.852570 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.852558 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:26:13.852719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.852649 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 17:26:13.854324 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.854292 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 17:26:13.854504 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.854328 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-109.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 17:26:13.854585 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.854522 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 17:26:13.854585 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.854535 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 17:26:13.854585 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.854554 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:26:13.855616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.855603 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 17:26:13.857436 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.857421 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 17:26:13.857596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.857584 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 17:26:13.860066 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.860051 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 17:26:13.860143 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.860074 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 17:26:13.860143 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.860090 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 17:26:13.860143 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.860103 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 17 17:26:13.860143 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.860117 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 17:26:13.861353 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.861338 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:26:13.861420 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.861363 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 17:26:13.864786 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.864765 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 17:26:13.866426 ip-10-0-137-109
kubenswrapper[2576]: I0417 17:26:13.866411 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:26:13.868878 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868866 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868884 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868890 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868896 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868902 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868907 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868913 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868919 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868925 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:26:13.868929 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868931 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:26:13.869189 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868940 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
17:26:13.869189 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.868949 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:26:13.869835 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.869802 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:26:13.869835 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.869812 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:26:13.876985 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.876953 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-109.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:26:13.876985 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.876971 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-109.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:26:13.877138 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.876971 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:26:13.877518 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.877502 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:26:13.877667 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.877565 2576 server.go:1295] "Started kubelet" Apr 17 17:26:13.877713 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.877635 2576 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 17 17:26:13.877755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.877709 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:26:13.877795 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.877786 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:26:13.878401 ip-10-0-137-109 systemd[1]: Started Kubernetes Kubelet. Apr 17 17:26:13.878932 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.878913 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:26:13.879452 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.879442 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:26:13.884057 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.884034 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:26:13.884660 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.884645 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:26:13.885581 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885384 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:26:13.885581 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885384 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:26:13.885581 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885417 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:26:13.885581 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885452 2576 factory.go:55] Registering systemd factory Apr 17 17:26:13.885581 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885499 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:26:13.885581 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885508 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 17 
17:26:13.885581 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885513 2576 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:26:13.885912 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.885750 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:13.885912 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885852 2576 factory.go:153] Registering CRI-O factory Apr 17 17:26:13.885912 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885866 2576 factory.go:223] Registration of the crio container factory successfully Apr 17 17:26:13.886061 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885934 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:26:13.886061 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885959 2576 factory.go:103] Registering Raw factory Apr 17 17:26:13.886061 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.885976 2576 manager.go:1196] Started watching for new ooms in manager Apr 17 17:26:13.886402 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.886388 2576 manager.go:319] Starting recovery of all containers Apr 17 17:26:13.887410 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.887385 2576 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:26:13.895154 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.895129 2576 manager.go:324] Recovery completed Apr 17 17:26:13.895476 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.895446 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-109.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:26:13.895797 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.895770 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:26:13.896762 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.896736 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8ln77" Apr 17 17:26:13.896916 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.895561 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-109.ec2.internal.18a734edc75bb291 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-109.ec2.internal,UID:ip-10-0-137-109.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-109.ec2.internal,},FirstTimestamp:2026-04-17 17:26:13.877518993 
+0000 UTC m=+0.452867274,LastTimestamp:2026-04-17 17:26:13.877518993 +0000 UTC m=+0.452867274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-109.ec2.internal,}" Apr 17 17:26:13.901669 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.901655 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:13.902914 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.902898 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8ln77" Apr 17 17:26:13.904722 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.904704 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:13.904722 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.904733 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:13.904854 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.904744 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:13.905305 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.905287 2576 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:26:13.905365 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.905304 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:26:13.905365 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.905324 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:26:13.906498 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.906431 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{ip-10-0-137-109.ec2.internal.18a734edc8fac03a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-109.ec2.internal,UID:ip-10-0-137-109.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-109.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-109.ec2.internal,},FirstTimestamp:2026-04-17 17:26:13.90471993 +0000 UTC m=+0.480068209,LastTimestamp:2026-04-17 17:26:13.90471993 +0000 UTC m=+0.480068209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-109.ec2.internal,}" Apr 17 17:26:13.908116 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.908104 2576 policy_none.go:49] "None policy: Start" Apr 17 17:26:13.908165 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.908120 2576 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:26:13.908165 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.908130 2576 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:26:13.951680 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.951663 2576 manager.go:341] "Starting Device Plugin manager" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.951708 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.951718 2576 server.go:85] "Starting device plugin registration server" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.951959 2576 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.951970 2576 container_log_manager.go:189] "Initializing 
container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.952081 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.952193 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.952201 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.952819 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:26:13.967407 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.952864 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:13.995048 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.994996 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:26:13.996310 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.996290 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:26:13.996391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.996329 2576 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:26:13.996391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.996351 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:26:13.996391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.996362 2576 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:26:13.996539 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:13.996459 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:26:13.999306 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:13.999281 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:14.052446 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.052363 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:14.053509 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.053492 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:14.053575 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.053523 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:14.053575 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.053535 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:14.053575 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.053561 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.065221 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.065194 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.065302 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.065224 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-109.ec2.internal\": node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 
17:26:14.081523 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.081494 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:14.097309 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.097262 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal"] Apr 17 17:26:14.097453 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.097353 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:14.098764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.098746 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:14.098861 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.098779 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:14.098861 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.098791 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:14.100124 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100110 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:14.100292 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100279 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.100327 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100313 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:14.100927 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100907 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:14.100927 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100919 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:14.101076 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100939 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:14.101076 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100940 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:14.101076 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100972 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:14.101076 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.100960 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:14.102251 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.102233 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.102345 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.102264 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:26:14.102941 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.102926 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:26:14.103008 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.102947 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:26:14.103008 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.102956 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:26:14.129961 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.129931 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-109.ec2.internal\" not found" node="ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.134594 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.134575 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-109.ec2.internal\" not found" node="ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.181839 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.181809 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:14.187158 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.187131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a191cd1b01780dbedd511a0221fb1c6-config\") pod 
\"kube-apiserver-proxy-ip-10-0-137-109.ec2.internal\" (UID: \"2a191cd1b01780dbedd511a0221fb1c6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.187260 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.187171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/db9baa965611dba5e7069bbf2b1b2600-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal\" (UID: \"db9baa965611dba5e7069bbf2b1b2600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.187260 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.187205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db9baa965611dba5e7069bbf2b1b2600-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal\" (UID: \"db9baa965611dba5e7069bbf2b1b2600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.282318 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.282273 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:14.287622 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.287607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db9baa965611dba5e7069bbf2b1b2600-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal\" (UID: \"db9baa965611dba5e7069bbf2b1b2600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.287682 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.287631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/2a191cd1b01780dbedd511a0221fb1c6-config\") pod \"kube-apiserver-proxy-ip-10-0-137-109.ec2.internal\" (UID: \"2a191cd1b01780dbedd511a0221fb1c6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.287682 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.287651 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/db9baa965611dba5e7069bbf2b1b2600-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal\" (UID: \"db9baa965611dba5e7069bbf2b1b2600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.287747 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.287717 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/db9baa965611dba5e7069bbf2b1b2600-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal\" (UID: \"db9baa965611dba5e7069bbf2b1b2600\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.287747 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.287722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2a191cd1b01780dbedd511a0221fb1c6-config\") pod \"kube-apiserver-proxy-ip-10-0-137-109.ec2.internal\" (UID: \"2a191cd1b01780dbedd511a0221fb1c6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.287804 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.287725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/db9baa965611dba5e7069bbf2b1b2600-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal\" (UID: \"db9baa965611dba5e7069bbf2b1b2600\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.383159 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.383071 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:14.431636 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.431598 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.437435 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.437406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.484262 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.484225 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:14.584735 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.584687 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:14.685243 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.685166 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-109.ec2.internal\" not found" Apr 17 17:26:14.747180 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.747147 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:14.785978 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.785939 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.791256 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.791092 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to 
start using new credentials" Apr 17 17:26:14.791383 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.791363 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:26:14.791432 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.791407 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:26:14.791475 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.791445 2576 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a995bbf1b2a844135aa0a80da29e8b28-211be44c789c746a.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.137.109:56354->3.209.244.162:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.791512 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.791479 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" Apr 17 17:26:14.810966 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.810944 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:26:14.861298 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.861269 2576 apiserver.go:52] "Watching apiserver" Apr 17 17:26:14.869776 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.869750 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:26:14.870500 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.870479 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal","openshift-multus/network-metrics-daemon-l28wh","openshift-ovn-kubernetes/ovnkube-node-d5gqt","kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf","openshift-cluster-node-tuning-operator/tuned-fmj2l","openshift-image-registry/node-ca-wxqd8","openshift-multus/multus-additional-cni-plugins-tpw89","openshift-multus/multus-f76vg","openshift-network-diagnostics/network-check-target-494hr","openshift-network-operator/iptables-alerter-bqt6x","kube-system/konnectivity-agent-xn5kt","openshift-dns/node-resolver-lrfkm"] Apr 17 17:26:14.872242 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.872216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:14.872433 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.872394 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065" Apr 17 17:26:14.875262 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.875237 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.875374 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.875283 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.876472 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.876452 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.877891 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.877870 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:26:14.877891 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.877890 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:26:14.878038 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.877899 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-phpj9\"" Apr 17 17:26:14.878969 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.878949 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:14.878969 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.878960 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:26:14.879134 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.878992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:26:14.879134 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.878962 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:26:14.879134 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.879090 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:26:14.879134 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.879119 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5p7zn\"" Apr 17 17:26:14.879347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.879135 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:26:14.879347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.879148 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:26:14.879347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.879196 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:26:14.879347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.879245 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:26:14.879347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.879300 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:26:14.879557 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.879455 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-np6vd\"" Apr 17 17:26:14.880287 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.880271 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.880551 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.880471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.881385 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.881372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:14.881446 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:14.881423 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22" Apr 17 17:26:14.881760 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.881745 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:26:14.881760 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.881768 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:26:14.881919 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.881775 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:26:14.882017 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.882005 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kltqm\"" Apr 17 17:26:14.882494 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.882479 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:14.882850 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.882833 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:26:14.882850 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.882844 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:26:14.882975 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.882839 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:26:14.883175 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.883159 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:26:14.883175 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.883185 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:26:14.883175 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.883209 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xk74t\"" Apr 17 17:26:14.883428 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.883194 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d6cjm\"" Apr 17 17:26:14.883428 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.883276 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:26:14.883831 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.883817 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:14.884124 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.884111 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:26:14.885034 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.885007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:14.885084 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.885059 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:26:14.885415 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.885401 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ljwph\"" Apr 17 17:26:14.885488 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.885466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:26:14.885488 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.885480 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:26:14.886567 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.886551 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:26:14.886680 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.886627 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7s5z8\"" Apr 17 17:26:14.886760 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.886745 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:26:14.886826 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:26:14.886787 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:26:14.887596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.887577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:26:14.887596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.887587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-862sf\"" Apr 17 17:26:14.887731 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.887609 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:26:14.890461 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890439 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-run-netns\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.890537 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.890537 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-cni-multus\") pod \"multus-f76vg\" (UID: 
\"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.890537 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-etc-kubernetes\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.890649 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890553 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysctl-d\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.890649 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgccz\" (UniqueName: \"kubernetes.io/projected/9a695a79-10fa-428f-95e4-f0cdc8dea701-kube-api-access-kgccz\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:14.890649 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-sys-fs\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.890777 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-sys\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.890777 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-host\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.890777 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-cni-bin\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.890900 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-env-overrides\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.890900 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-cni-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.890900 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890878 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-socket-dir-parent\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891007 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890907 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-netns\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891007 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.891007 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.891007 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-os-release\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " 
pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891007 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.890998 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-conf-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9a695a79-10fa-428f-95e4-f0cdc8dea701-iptables-alerter-script\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:14.891236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-run\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.891236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-ovn\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.891236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cnibin\") pod 
\"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.891236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.891236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891199 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-system-cni-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.891236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-k8s-cni-cncf-io\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891517 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891253 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klwdm\" (UniqueName: \"kubernetes.io/projected/5825282c-f2bb-4812-ae37-269c52c423f7-kube-api-access-klwdm\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891517 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-slash\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.891517 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovnkube-config\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.891517 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-cnibin\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891517 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.891331 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5825282c-f2bb-4812-ae37-269c52c423f7-multus-daemon-config\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.891517 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 17:26:14.891347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:14.894497 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.894469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-socket-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.895447 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.895170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-etc-selinux\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.895566 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.895523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r29d8\" (UniqueName: \"kubernetes.io/projected/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-kube-api-access-r29d8\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.895646 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.895610 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-system-cni-dir\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.895722 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.895644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a695a79-10fa-428f-95e4-f0cdc8dea701-host-slash\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:14.895840 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.895803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/141084a6-6657-4416-a7f1-21339dfd8b0a-serviceca\") pod \"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:14.895899 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.895842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nwh7\" (UniqueName: \"kubernetes.io/projected/141084a6-6657-4416-a7f1-21339dfd8b0a-kube-api-access-6nwh7\") pod \"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:14.895899 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.895891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftflr\" (UniqueName: \"kubernetes.io/projected/c578d9f0-5deb-4aba-b916-5f7c6bec807d-kube-api-access-ftflr\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:14.896065 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.895924 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-systemd-units\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.896146 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.896146 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896107 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-kubelet\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.896293 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-registration-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.896293 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-modprobe-d\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " 
pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.896293 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d1855494-351c-453b-b8fb-c603cdf8d73a-agent-certs\") pod \"konnectivity-agent-xn5kt\" (UID: \"d1855494-351c-453b-b8fb-c603cdf8d73a\") " pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:14.896293 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-node-log\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.896531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysconfig\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.896531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-kubernetes\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.896531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-var-lib-kubelet\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.896531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovnkube-script-lib\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.896761 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896547 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-multus-certs\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.896761 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:14.896761 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-device-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.896761 ip-10-0-137-109 kubenswrapper[2576]: I0417 
17:26:14.896653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kc8x\" (UniqueName: \"kubernetes.io/projected/ea346c91-9370-4cf0-be0a-d3d37d74e482-kube-api-access-4kc8x\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.896761 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896683 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-tuned\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.896761 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d1855494-351c-453b-b8fb-c603cdf8d73a-konnectivity-ca\") pod \"konnectivity-agent-xn5kt\" (UID: \"d1855494-351c-453b-b8fb-c603cdf8d73a\") " pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:14.897115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9dz6\" (UniqueName: \"kubernetes.io/projected/3d6308be-3783-4dbc-bf78-b2d0765d73c8-kube-api-access-w9dz6\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.897115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-cni-bin\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.897115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-hostroot\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.897115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/141084a6-6657-4416-a7f1-21339dfd8b0a-host\") pod \"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:14.897115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-var-lib-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.897115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.896980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-etc-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.897115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897100 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hk7vp\" (UniqueName: \"kubernetes.io/projected/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-kube-api-access-hk7vp\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxvgb\" (UniqueName: \"kubernetes.io/projected/9ed89eb9-a1af-4733-9f98-72c27e607f22-kube-api-access-cxvgb\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897172 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-os-release\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tpw89\" (UID: 
\"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-systemd\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ed89eb9-a1af-4733-9f98-72c27e607f22-tmp\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-kubelet\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovn-node-metrics-cert\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/5825282c-f2bb-4812-ae37-269c52c423f7-cni-binary-copy\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysctl-conf\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.897451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c578d9f0-5deb-4aba-b916-5f7c6bec807d-hosts-file\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:14.897962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c578d9f0-5deb-4aba-b916-5f7c6bec807d-tmp-dir\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:14.897962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897498 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-systemd\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.897962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897527 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-log-socket\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.897962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-cni-netd\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.897962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.897581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-lib-modules\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.901206 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.901185 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:26:14.905096 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.905073 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:21:13 +0000 UTC" deadline="2027-09-23 19:41:47.407351308 +0000 UTC" Apr 17 17:26:14.905144 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.905096 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12578h15m32.502257159s" Apr 17 17:26:14.917612 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.917593 2576 csr.go:274] 
"Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-btgpj" Apr 17 17:26:14.924976 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.924953 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-btgpj" Apr 17 17:26:14.997861 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.997832 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-kubelet\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.997861 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.997863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovn-node-metrics-cert\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998107 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.997881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5825282c-f2bb-4812-ae37-269c52c423f7-cni-binary-copy\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.998107 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.997901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysctl-conf\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.998107 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.997951 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c578d9f0-5deb-4aba-b916-5f7c6bec807d-hosts-file\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:14.998107 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.997988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c578d9f0-5deb-4aba-b916-5f7c6bec807d-tmp-dir\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:14.998107 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998013 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-systemd\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998107 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998075 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-systemd\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998094 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c578d9f0-5deb-4aba-b916-5f7c6bec807d-hosts-file\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-log-socket\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-cni-netd\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-lib-modules\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-run-netns\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998224 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysctl-conf\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998227 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-log-socket\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998252 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-cni-netd\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-cni-multus\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998298 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998301 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-etc-kubernetes\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-run-netns\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-etc-kubernetes\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysctl-d\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-lib-modules\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.998384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-cni-multus\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998392 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgccz\" (UniqueName: \"kubernetes.io/projected/9a695a79-10fa-428f-95e4-f0cdc8dea701-kube-api-access-kgccz\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-sys-fs\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-sys\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-host\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysctl-d\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-cni-bin\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-sys\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-env-overrides\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-host\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-cni-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998586 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-sys-fs\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-cni-bin\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998625 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-cni-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-socket-dir-parent\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998659 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-kubelet\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-netns\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:14.999217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998703 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-socket-dir-parent\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998741 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-os-release\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-netns\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998773 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-conf-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9a695a79-10fa-428f-95e4-f0cdc8dea701-iptables-alerter-script\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-os-release\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-run\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-ovn\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cnibin\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998891 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-run\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998933 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-ovn\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-env-overrides\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-system-cni-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.000232 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998933 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-system-cni-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.998971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cnibin\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999042 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-multus-conf-dir\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-k8s-cni-cncf-io\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999097 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klwdm\" (UniqueName: \"kubernetes.io/projected/5825282c-f2bb-4812-ae37-269c52c423f7-kube-api-access-klwdm\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-k8s-cni-cncf-io\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-slash\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovnkube-config\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-cnibin\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5825282c-f2bb-4812-ae37-269c52c423f7-multus-daemon-config\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-socket-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999242 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-etc-selinux\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999258 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r29d8\" (UniqueName: \"kubernetes.io/projected/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-kube-api-access-r29d8\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-system-cni-dir\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.001073 ip-10-0-137-109 kubenswrapper[2576]: I0417 
17:26:14.999322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a695a79-10fa-428f-95e4-f0cdc8dea701-host-slash\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/141084a6-6657-4416-a7f1-21339dfd8b0a-serviceca\") pod \"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nwh7\" (UniqueName: \"kubernetes.io/projected/141084a6-6657-4416-a7f1-21339dfd8b0a-kube-api-access-6nwh7\") pod \"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/9a695a79-10fa-428f-95e4-f0cdc8dea701-iptables-alerter-script\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftflr\" (UniqueName: \"kubernetes.io/projected/c578d9f0-5deb-4aba-b916-5f7c6bec807d-kube-api-access-ftflr\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:15.001811 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:26:14.999422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-systemd-units\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-kubelet\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999510 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5825282c-f2bb-4812-ae37-269c52c423f7-cni-binary-copy\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001811 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-registration-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-modprobe-d\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d1855494-351c-453b-b8fb-c603cdf8d73a-agent-certs\") pod \"konnectivity-agent-xn5kt\" (UID: \"d1855494-351c-453b-b8fb-c603cdf8d73a\") " pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-node-log\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysconfig\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" 
Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-kubernetes\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-cnibin\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.001811 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999724 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-var-lib-kubelet\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovnkube-script-lib\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-multus-certs\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.002392 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-device-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4kc8x\" (UniqueName: \"kubernetes.io/projected/ea346c91-9370-4cf0-be0a-d3d37d74e482-kube-api-access-4kc8x\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-tuned\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999965 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d1855494-351c-453b-b8fb-c603cdf8d73a-konnectivity-ca\") pod \"konnectivity-agent-xn5kt\" (UID: 
\"d1855494-351c-453b-b8fb-c603cdf8d73a\") " pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:14.999977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/141084a6-6657-4416-a7f1-21339dfd8b0a-serviceca\") pod \"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9dz6\" (UniqueName: \"kubernetes.io/projected/3d6308be-3783-4dbc-bf78-b2d0765d73c8-kube-api-access-w9dz6\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovnkube-config\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-cni-bin\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5825282c-f2bb-4812-ae37-269c52c423f7-multus-daemon-config\") pod \"multus-f76vg\" 
(UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-host-slash\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9a695a79-10fa-428f-95e4-f0cdc8dea701-host-slash\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-kubernetes\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-var-lib-kubelet\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.002392 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000143 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c578d9f0-5deb-4aba-b916-5f7c6bec807d-tmp-dir\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " 
pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000162 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-system-cni-dir\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000282 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-etc-selinux\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-run-multus-certs\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-systemd-units\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-hostroot\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " 
pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000408 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-kubelet\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.000416 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/141084a6-6657-4416-a7f1-21339dfd8b0a-host\") pod \"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-var-lib-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-device-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.000486 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:15.500466479 +0000 UTC m=+2.075814786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000416 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-socket-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-var-lib-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000503 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-modprobe-d\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/ea346c91-9370-4cf0-be0a-d3d37d74e482-registration-dir\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000544 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-hostroot\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.003069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000553 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-run-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000584 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/141084a6-6657-4416-a7f1-21339dfd8b0a-host\") pod \"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-etc-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7vp\" (UniqueName: 
\"kubernetes.io/projected/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-kube-api-access-hk7vp\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000634 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovnkube-script-lib\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-etc-openvswitch\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5825282c-f2bb-4812-ae37-269c52c423f7-host-var-lib-cni-bin\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000728 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxvgb\" (UniqueName: \"kubernetes.io/projected/9ed89eb9-a1af-4733-9f98-72c27e607f22-kube-api-access-cxvgb\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000753 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-sysconfig\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-os-release\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-node-log\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 
17:26:15.000844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-systemd\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000921 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3d6308be-3783-4dbc-bf78-b2d0765d73c8-os-release\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ed89eb9-a1af-4733-9f98-72c27e607f22-tmp\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.000965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-systemd\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.003862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.001348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-cni-binary-copy\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.004411 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 17:26:15.001385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d1855494-351c-453b-b8fb-c603cdf8d73a-konnectivity-ca\") pod \"konnectivity-agent-xn5kt\" (UID: \"d1855494-351c-453b-b8fb-c603cdf8d73a\") " pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:15.004411 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.001521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3d6308be-3783-4dbc-bf78-b2d0765d73c8-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.004411 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.002657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-ovn-node-metrics-cert\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.004411 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.003129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9ed89eb9-a1af-4733-9f98-72c27e607f22-etc-tuned\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.004411 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.003343 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9ed89eb9-a1af-4733-9f98-72c27e607f22-tmp\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.004411 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:26:15.003603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d1855494-351c-453b-b8fb-c603cdf8d73a-agent-certs\") pod \"konnectivity-agent-xn5kt\" (UID: \"d1855494-351c-453b-b8fb-c603cdf8d73a\") " pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:15.006816 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.006796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgccz\" (UniqueName: \"kubernetes.io/projected/9a695a79-10fa-428f-95e4-f0cdc8dea701-kube-api-access-kgccz\") pod \"iptables-alerter-bqt6x\" (UID: \"9a695a79-10fa-428f-95e4-f0cdc8dea701\") " pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:15.009249 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.009215 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:15.009418 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.009255 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:15.009418 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.009297 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mnw4k for pod openshift-network-diagnostics/network-check-target-494hr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:15.009606 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.009591 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k podName:cd2a6ac8-6e93-4277-9929-2c37daaabc22 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:26:15.509362621 +0000 UTC m=+2.084710910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mnw4k" (UniqueName: "kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k") pod "network-check-target-494hr" (UID: "cd2a6ac8-6e93-4277-9929-2c37daaabc22") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:15.010340 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.010317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klwdm\" (UniqueName: \"kubernetes.io/projected/5825282c-f2bb-4812-ae37-269c52c423f7-kube-api-access-klwdm\") pod \"multus-f76vg\" (UID: \"5825282c-f2bb-4812-ae37-269c52c423f7\") " pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.011185 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.011159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r29d8\" (UniqueName: \"kubernetes.io/projected/47b24c49-3baa-47b7-9b2e-e0fd6d27367d-kube-api-access-r29d8\") pod \"ovnkube-node-d5gqt\" (UID: \"47b24c49-3baa-47b7-9b2e-e0fd6d27367d\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.012805 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.012779 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftflr\" (UniqueName: \"kubernetes.io/projected/c578d9f0-5deb-4aba-b916-5f7c6bec807d-kube-api-access-ftflr\") pod \"node-resolver-lrfkm\" (UID: \"c578d9f0-5deb-4aba-b916-5f7c6bec807d\") " pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:15.012899 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.012807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nwh7\" (UniqueName: \"kubernetes.io/projected/141084a6-6657-4416-a7f1-21339dfd8b0a-kube-api-access-6nwh7\") pod 
\"node-ca-wxqd8\" (UID: \"141084a6-6657-4416-a7f1-21339dfd8b0a\") " pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:15.013070 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.013053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kc8x\" (UniqueName: \"kubernetes.io/projected/ea346c91-9370-4cf0-be0a-d3d37d74e482-kube-api-access-4kc8x\") pod \"aws-ebs-csi-driver-node-hbfgf\" (UID: \"ea346c91-9370-4cf0-be0a-d3d37d74e482\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.013309 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.013289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7vp\" (UniqueName: \"kubernetes.io/projected/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-kube-api-access-hk7vp\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:15.013995 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.013950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9dz6\" (UniqueName: \"kubernetes.io/projected/3d6308be-3783-4dbc-bf78-b2d0765d73c8-kube-api-access-w9dz6\") pod \"multus-additional-cni-plugins-tpw89\" (UID: \"3d6308be-3783-4dbc-bf78-b2d0765d73c8\") " pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.014341 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.014322 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxvgb\" (UniqueName: \"kubernetes.io/projected/9ed89eb9-a1af-4733-9f98-72c27e607f22-kube-api-access-cxvgb\") pod \"tuned-fmj2l\" (UID: \"9ed89eb9-a1af-4733-9f98-72c27e607f22\") " pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.025469 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.025436 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:15.032451 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.032424 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lrfkm" Apr 17 17:26:15.117689 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.117654 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb9baa965611dba5e7069bbf2b1b2600.slice/crio-87fad8dd1938232d475c9866e211b998d0b6d6fa36e4f40ea1bac69a99e22631 WatchSource:0}: Error finding container 87fad8dd1938232d475c9866e211b998d0b6d6fa36e4f40ea1bac69a99e22631: Status 404 returned error can't find the container with id 87fad8dd1938232d475c9866e211b998d0b6d6fa36e4f40ea1bac69a99e22631 Apr 17 17:26:15.118003 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.117981 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a191cd1b01780dbedd511a0221fb1c6.slice/crio-ca27ef08a0fbc0cf5091ce4ac0083f1ac112cd948949d84a35cc0a5235126a15 WatchSource:0}: Error finding container ca27ef08a0fbc0cf5091ce4ac0083f1ac112cd948949d84a35cc0a5235126a15: Status 404 returned error can't find the container with id ca27ef08a0fbc0cf5091ce4ac0083f1ac112cd948949d84a35cc0a5235126a15 Apr 17 17:26:15.122959 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.122942 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:26:15.175574 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.175545 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:15.202450 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.202324 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:26:15.208378 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.208352 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b24c49_3baa_47b7_9b2e_e0fd6d27367d.slice/crio-df88e761d12e93fce4ecfc364efee14ac384c366cc6125bd6a1f2ddcf59ac274 WatchSource:0}: Error finding container df88e761d12e93fce4ecfc364efee14ac384c366cc6125bd6a1f2ddcf59ac274: Status 404 returned error can't find the container with id df88e761d12e93fce4ecfc364efee14ac384c366cc6125bd6a1f2ddcf59ac274 Apr 17 17:26:15.215336 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.215312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" Apr 17 17:26:15.221087 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.221064 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea346c91_9370_4cf0_be0a_d3d37d74e482.slice/crio-f79900d50d907b5f558fd4c313405f612db57af76018633f29bdd18a4d3dbaeb WatchSource:0}: Error finding container f79900d50d907b5f558fd4c313405f612db57af76018633f29bdd18a4d3dbaeb: Status 404 returned error can't find the container with id f79900d50d907b5f558fd4c313405f612db57af76018633f29bdd18a4d3dbaeb Apr 17 17:26:15.229376 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.229355 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" Apr 17 17:26:15.235960 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.235934 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ed89eb9_a1af_4733_9f98_72c27e607f22.slice/crio-ece9e3db22f543915e4d95096356357ac822f7b77269dc57a995981be4b6d190 WatchSource:0}: Error finding container ece9e3db22f543915e4d95096356357ac822f7b77269dc57a995981be4b6d190: Status 404 returned error can't find the container with id ece9e3db22f543915e4d95096356357ac822f7b77269dc57a995981be4b6d190 Apr 17 17:26:15.244455 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.244435 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wxqd8" Apr 17 17:26:15.252000 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.251960 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod141084a6_6657_4416_a7f1_21339dfd8b0a.slice/crio-2012ffcc1813001f0e768d7689c2746ac9a0f6a5cab7a14f1585262b3a28274b WatchSource:0}: Error finding container 2012ffcc1813001f0e768d7689c2746ac9a0f6a5cab7a14f1585262b3a28274b: Status 404 returned error can't find the container with id 2012ffcc1813001f0e768d7689c2746ac9a0f6a5cab7a14f1585262b3a28274b Apr 17 17:26:15.257952 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.257929 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tpw89" Apr 17 17:26:15.264442 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.264411 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d6308be_3783_4dbc_bf78_b2d0765d73c8.slice/crio-1c728f2c38d8068b399e749b9619d8609dbba1164a21626e3d0be311d1a4c2d7 WatchSource:0}: Error finding container 1c728f2c38d8068b399e749b9619d8609dbba1164a21626e3d0be311d1a4c2d7: Status 404 returned error can't find the container with id 1c728f2c38d8068b399e749b9619d8609dbba1164a21626e3d0be311d1a4c2d7 Apr 17 17:26:15.276337 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.276303 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f76vg" Apr 17 17:26:15.284324 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.284296 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5825282c_f2bb_4812_ae37_269c52c423f7.slice/crio-e51061f80789e2d6225ea45c3cc703d869562b89c6455bd2b0a1661a69796f3a WatchSource:0}: Error finding container e51061f80789e2d6225ea45c3cc703d869562b89c6455bd2b0a1661a69796f3a: Status 404 returned error can't find the container with id e51061f80789e2d6225ea45c3cc703d869562b89c6455bd2b0a1661a69796f3a Apr 17 17:26:15.290549 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.290523 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-bqt6x" Apr 17 17:26:15.296823 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.296795 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a695a79_10fa_428f_95e4_f0cdc8dea701.slice/crio-183292a464654855a0862ee44634e0c17633414cfa78844a2578c1c70e0f3d64 WatchSource:0}: Error finding container 183292a464654855a0862ee44634e0c17633414cfa78844a2578c1c70e0f3d64: Status 404 returned error can't find the container with id 183292a464654855a0862ee44634e0c17633414cfa78844a2578c1c70e0f3d64 Apr 17 17:26:15.311905 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.311878 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1855494_351c_453b_b8fb_c603cdf8d73a.slice/crio-1d57edc3231ccd9ae2befbef1c259d1f0059b3bc1c7fb320a5a017c8dc2e212c WatchSource:0}: Error finding container 1d57edc3231ccd9ae2befbef1c259d1f0059b3bc1c7fb320a5a017c8dc2e212c: Status 404 returned error can't find the container with id 1d57edc3231ccd9ae2befbef1c259d1f0059b3bc1c7fb320a5a017c8dc2e212c Apr 17 17:26:15.333129 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:15.333092 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc578d9f0_5deb_4aba_b916_5f7c6bec807d.slice/crio-d8f04a18d31883b4a6fcf6110b24f70b1ff213c4eaaf85ad958dd2b135c07818 WatchSource:0}: Error finding container d8f04a18d31883b4a6fcf6110b24f70b1ff213c4eaaf85ad958dd2b135c07818: Status 404 returned error can't find the container with id d8f04a18d31883b4a6fcf6110b24f70b1ff213c4eaaf85ad958dd2b135c07818 Apr 17 17:26:15.472361 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.472276 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:26:15.505305 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:26:15.505268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:15.505467 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.505451 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:15.505528 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.505516 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:16.505501998 +0000 UTC m=+3.080850264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:26:15.605953 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.605915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:15.606166 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.606136 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:15.606166 
ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.606161 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:15.606276 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.606176 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mnw4k for pod openshift-network-diagnostics/network-check-target-494hr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:15.606276 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:15.606246 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k podName:cd2a6ac8-6e93-4277-9929-2c37daaabc22 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:16.60622491 +0000 UTC m=+3.181573192 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mnw4k" (UniqueName: "kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k") pod "network-check-target-494hr" (UID: "cd2a6ac8-6e93-4277-9929-2c37daaabc22") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:15.928045 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.927886 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:21:14 +0000 UTC" deadline="2027-12-18 22:19:33.353813167 +0000 UTC" Apr 17 17:26:15.928045 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:15.927921 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14644h53m17.425896012s" Apr 17 17:26:16.042619 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.042522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xn5kt" event={"ID":"d1855494-351c-453b-b8fb-c603cdf8d73a","Type":"ContainerStarted","Data":"1d57edc3231ccd9ae2befbef1c259d1f0059b3bc1c7fb320a5a017c8dc2e212c"} Apr 17 17:26:16.067616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.067553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bqt6x" event={"ID":"9a695a79-10fa-428f-95e4-f0cdc8dea701","Type":"ContainerStarted","Data":"183292a464654855a0862ee44634e0c17633414cfa78844a2578c1c70e0f3d64"} Apr 17 17:26:16.083098 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.082959 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f76vg" event={"ID":"5825282c-f2bb-4812-ae37-269c52c423f7","Type":"ContainerStarted","Data":"e51061f80789e2d6225ea45c3cc703d869562b89c6455bd2b0a1661a69796f3a"} Apr 17 17:26:16.111165 ip-10-0-137-109 kubenswrapper[2576]: I0417 
17:26:16.111123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpw89" event={"ID":"3d6308be-3783-4dbc-bf78-b2d0765d73c8","Type":"ContainerStarted","Data":"1c728f2c38d8068b399e749b9619d8609dbba1164a21626e3d0be311d1a4c2d7"}
Apr 17 17:26:16.113316 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.113287 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wxqd8" event={"ID":"141084a6-6657-4416-a7f1-21339dfd8b0a","Type":"ContainerStarted","Data":"2012ffcc1813001f0e768d7689c2746ac9a0f6a5cab7a14f1585262b3a28274b"}
Apr 17 17:26:16.123225 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.123164 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" event={"ID":"9ed89eb9-a1af-4733-9f98-72c27e607f22","Type":"ContainerStarted","Data":"ece9e3db22f543915e4d95096356357ac822f7b77269dc57a995981be4b6d190"}
Apr 17 17:26:16.135750 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.135712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" event={"ID":"db9baa965611dba5e7069bbf2b1b2600","Type":"ContainerStarted","Data":"87fad8dd1938232d475c9866e211b998d0b6d6fa36e4f40ea1bac69a99e22631"}
Apr 17 17:26:16.155810 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.155770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" event={"ID":"2a191cd1b01780dbedd511a0221fb1c6","Type":"ContainerStarted","Data":"ca27ef08a0fbc0cf5091ce4ac0083f1ac112cd948949d84a35cc0a5235126a15"}
Apr 17 17:26:16.158681 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.158475 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:26:16.165036 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.164945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lrfkm" event={"ID":"c578d9f0-5deb-4aba-b916-5f7c6bec807d","Type":"ContainerStarted","Data":"d8f04a18d31883b4a6fcf6110b24f70b1ff213c4eaaf85ad958dd2b135c07818"}
Apr 17 17:26:16.173888 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.173850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" event={"ID":"ea346c91-9370-4cf0-be0a-d3d37d74e482","Type":"ContainerStarted","Data":"f79900d50d907b5f558fd4c313405f612db57af76018633f29bdd18a4d3dbaeb"}
Apr 17 17:26:16.197065 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.196962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"df88e761d12e93fce4ecfc364efee14ac384c366cc6125bd6a1f2ddcf59ac274"}
Apr 17 17:26:16.518478 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.517677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:16.518478 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:16.517855 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:16.518478 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:16.517930 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:18.517911527 +0000 UTC m=+5.093259801 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:16.618596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.618556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:16.618757 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:16.618717 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:26:16.618757 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:16.618736 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:26:16.618757 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:16.618749 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mnw4k for pod openshift-network-diagnostics/network-check-target-494hr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:26:16.618918 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:16.618809 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k podName:cd2a6ac8-6e93-4277-9929-2c37daaabc22 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:18.618790613 +0000 UTC m=+5.194138898 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnw4k" (UniqueName: "kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k") pod "network-check-target-494hr" (UID: "cd2a6ac8-6e93-4277-9929-2c37daaabc22") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:26:16.928366 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.928266 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:21:14 +0000 UTC" deadline="2028-01-08 12:39:59.30656236 +0000 UTC"
Apr 17 17:26:16.928366 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.928315 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15139h13m42.378251637s"
Apr 17 17:26:16.996621 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.996580 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:16.996804 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:16.996728 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:16.997198 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:16.997179 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:16.997293 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:16.997274 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:17.568282 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:17.568155 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:26:18.533435 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:18.533396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:18.533898 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:18.533546 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:18.533898 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:18.533613 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:22.533593776 +0000 UTC m=+9.108942062 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:18.633990 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:18.633932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:18.634142 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:18.634123 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:26:18.634196 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:18.634145 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:26:18.634196 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:18.634158 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mnw4k for pod openshift-network-diagnostics/network-check-target-494hr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:26:18.634305 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:18.634220 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k podName:cd2a6ac8-6e93-4277-9929-2c37daaabc22 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:22.634199541 +0000 UTC m=+9.209547823 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnw4k" (UniqueName: "kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k") pod "network-check-target-494hr" (UID: "cd2a6ac8-6e93-4277-9929-2c37daaabc22") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:26:18.996629 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:18.996544 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:18.996791 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:18.996675 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:18.997105 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:18.997065 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:18.997227 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:18.997159 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:20.997247 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:20.997207 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:20.997724 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:20.997207 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:20.997724 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:20.997394 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:20.997724 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:20.997523 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:22.561911 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:22.561866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:22.562401 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:22.562054 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:22.562401 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:22.562120 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:30.562099846 +0000 UTC m=+17.137448116 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:22.662645 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:22.662610 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:22.662842 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:22.662784 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:26:22.662842 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:22.662804 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:26:22.662842 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:22.662815 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mnw4k for pod openshift-network-diagnostics/network-check-target-494hr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:26:22.663004 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:22.662868 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k podName:cd2a6ac8-6e93-4277-9929-2c37daaabc22 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:30.662851355 +0000 UTC m=+17.238199625 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnw4k" (UniqueName: "kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k") pod "network-check-target-494hr" (UID: "cd2a6ac8-6e93-4277-9929-2c37daaabc22") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:26:22.997438 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:22.997339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:22.997605 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:22.997346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:22.997605 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:22.997477 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:22.997605 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:22.997575 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:23.023086 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.023052 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dhvm5"]
Apr 17 17:26:23.026318 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.026258 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.026468 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:23.026377 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700"
Apr 17 17:26:23.065313 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.065119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-dbus\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.065313 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.065201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-kubelet-config\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.065313 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.065248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.166889 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.166476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-kubelet-config\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.166889 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.166541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.166889 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.166588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-dbus\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.166889 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.166699 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-dbus\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.166889 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.166760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-kubelet-config\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.166889 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:23.166858 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:23.167238 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:23.166920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret podName:e8cb7f89-2b7c-45fe-9d10-6a4afcec6700 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:23.666899097 +0000 UTC m=+10.242247378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret") pod "global-pull-secret-syncer-dhvm5" (UID: "e8cb7f89-2b7c-45fe-9d10-6a4afcec6700") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:23.671739 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:23.671644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:23.672235 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:23.671783 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:23.672235 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:23.671866 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret podName:e8cb7f89-2b7c-45fe-9d10-6a4afcec6700 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:24.671846481 +0000 UTC m=+11.247194768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret") pod "global-pull-secret-syncer-dhvm5" (UID: "e8cb7f89-2b7c-45fe-9d10-6a4afcec6700") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:24.681161 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:24.681014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:24.681161 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:24.681136 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:24.681706 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:24.681206 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret podName:e8cb7f89-2b7c-45fe-9d10-6a4afcec6700 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:26.681187538 +0000 UTC m=+13.256535809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret") pod "global-pull-secret-syncer-dhvm5" (UID: "e8cb7f89-2b7c-45fe-9d10-6a4afcec6700") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:24.996967 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:24.996865 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:24.996967 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:24.996891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:24.996967 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:24.996891 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:24.997255 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:24.997041 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:24.997521 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:24.997494 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700"
Apr 17 17:26:24.997636 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:24.997602 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:26.694131 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:26.694086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:26.694591 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:26.694233 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:26.694591 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:26.694320 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret podName:e8cb7f89-2b7c-45fe-9d10-6a4afcec6700 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:30.694298956 +0000 UTC m=+17.269647244 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret") pod "global-pull-secret-syncer-dhvm5" (UID: "e8cb7f89-2b7c-45fe-9d10-6a4afcec6700") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:26.996990 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:26.996907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:26.997154 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:26.996907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:26.997154 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:26.997042 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:26.997154 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:26.996907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:26.997154 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:26.997107 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:26.997318 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:26.997179 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700"
Apr 17 17:26:28.997328 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:28.997234 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:28.997328 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:28.997279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:28.997852 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:28.997389 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:28.997852 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:28.997446 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:28.997852 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:28.997475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:28.997852 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:28.997562 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700"
Apr 17 17:26:30.621393 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:30.621356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:30.621846 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.621505 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:30.621846 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.621587 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:46.621566101 +0000 UTC m=+33.196914379 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:26:30.722216 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:30.722164 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:30.722422 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:30.722263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:30.722422 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.722346 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:30.722422 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.722383 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:26:30.722422 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.722411 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:26:30.722422 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.722425 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mnw4k for pod openshift-network-diagnostics/network-check-target-494hr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:26:30.722680 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.722451 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret podName:e8cb7f89-2b7c-45fe-9d10-6a4afcec6700 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:38.722429263 +0000 UTC m=+25.297777529 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret") pod "global-pull-secret-syncer-dhvm5" (UID: "e8cb7f89-2b7c-45fe-9d10-6a4afcec6700") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:30.722680 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.722477 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k podName:cd2a6ac8-6e93-4277-9929-2c37daaabc22 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:46.722460708 +0000 UTC m=+33.297808974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnw4k" (UniqueName: "kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k") pod "network-check-target-494hr" (UID: "cd2a6ac8-6e93-4277-9929-2c37daaabc22") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:26:30.997540 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:30.997460 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:30.997785 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:30.997461 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:30.997785 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.997606 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065" Apr 17 17:26:30.997785 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:30.997460 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5" Apr 17 17:26:30.997785 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.997663 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22" Apr 17 17:26:30.997785 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:30.997776 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700" Apr 17 17:26:32.997372 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:32.997331 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5" Apr 17 17:26:32.997873 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:32.997346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:32.997873 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:32.997443 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700" Apr 17 17:26:32.997873 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:32.997346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:32.997873 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:32.997520 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22" Apr 17 17:26:32.997873 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:32.997590 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065" Apr 17 17:26:34.235454 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.235117 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f76vg" event={"ID":"5825282c-f2bb-4812-ae37-269c52c423f7","Type":"ContainerStarted","Data":"8cfbbfe95bf6703ea377b9a05dc3a42d28216f256b23a4ecda297742c914eda4"} Apr 17 17:26:34.238321 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.238291 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" event={"ID":"9ed89eb9-a1af-4733-9f98-72c27e607f22","Type":"ContainerStarted","Data":"f28ae9c23236e5db40cd36b49c1d28198ce103596fdade9e8881eccbc653e031"} Apr 17 17:26:34.240407 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.240337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" event={"ID":"2a191cd1b01780dbedd511a0221fb1c6","Type":"ContainerStarted","Data":"4efa12e01a6d55b5099675fc2711c4e3070052d131beba4d86836dc6387faa9a"} Apr 17 17:26:34.244383 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.244353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 17:26:34.244754 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.244723 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="47b24c49-3baa-47b7-9b2e-e0fd6d27367d" containerID="c7ce09e67782f4c8da99cbbf252022e3290cf8e5480ec2012708480bd3c9e0cf" exitCode=1 Apr 17 17:26:34.244859 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.244774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"b150daf28e53dc92ca477bcffb91be02d5c9994d71e1f4824b81024eb04ee7f3"} Apr 17 17:26:34.244859 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.244799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"c8d570639f297dd35dbc96aa001bd1383ea1ec140e0e79f66ff3219a9474a385"} Apr 17 17:26:34.244859 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.244814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"9e6837d957c35d05570bac95ef3f53c8250f7c0fad6a33d54abeb80a4d86ae71"} Apr 17 17:26:34.244859 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.244829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"e3e3fa3ca8c9211623ec8a500ccff930a9129018ce6e49883e85bef7524cdc54"} Apr 17 17:26:34.244859 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.244841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerDied","Data":"c7ce09e67782f4c8da99cbbf252022e3290cf8e5480ec2012708480bd3c9e0cf"} Apr 17 17:26:34.244859 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.244856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"a13a88c63741e28aa1003c71eef48125be7a94da8e0155b0371055db0da4002e"} Apr 17 17:26:34.251115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.251057 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f76vg" podStartSLOduration=1.900357587 podStartE2EDuration="20.251017767s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:26:15.285887021 +0000 UTC m=+1.861235287" lastFinishedPulling="2026-04-17 17:26:33.636547187 +0000 UTC m=+20.211895467" observedRunningTime="2026-04-17 17:26:34.250268447 +0000 UTC m=+20.825616737" watchObservedRunningTime="2026-04-17 17:26:34.251017767 +0000 UTC m=+20.826366059" Apr 17 17:26:34.265580 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.265538 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fmj2l" podStartSLOduration=1.998554344 podStartE2EDuration="20.265523231s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:26:15.238146156 +0000 UTC m=+1.813494425" lastFinishedPulling="2026-04-17 17:26:33.505115031 +0000 UTC m=+20.080463312" observedRunningTime="2026-04-17 17:26:34.265313673 +0000 UTC m=+20.840661961" watchObservedRunningTime="2026-04-17 17:26:34.265523231 +0000 UTC m=+20.840871519" Apr 17 17:26:34.996987 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.996943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:34.996987 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.996967 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:34.997241 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:34.996944 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5" Apr 17 17:26:34.997241 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:34.997079 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22" Apr 17 17:26:34.997241 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:34.997203 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700" Apr 17 17:26:34.997396 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:34.997303 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065" Apr 17 17:26:35.248422 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.248319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xn5kt" event={"ID":"d1855494-351c-453b-b8fb-c603cdf8d73a","Type":"ContainerStarted","Data":"1f5be332d9f2fcfe2aea65d8bf2e96dae2823bfd7907e227a90d2fc275e10f42"} Apr 17 17:26:35.250461 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.250429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-bqt6x" event={"ID":"9a695a79-10fa-428f-95e4-f0cdc8dea701","Type":"ContainerStarted","Data":"5f2c314547375283bca79edd3656c844221a700ebfbac14f9d15eb70b4cab726"} Apr 17 17:26:35.253454 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.253420 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d6308be-3783-4dbc-bf78-b2d0765d73c8" containerID="c80589c03710f14fe2082a5fd387bcce86287ff6425e027dc5c5d6e30d017683" exitCode=0 Apr 17 17:26:35.253596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.253511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpw89" event={"ID":"3d6308be-3783-4dbc-bf78-b2d0765d73c8","Type":"ContainerDied","Data":"c80589c03710f14fe2082a5fd387bcce86287ff6425e027dc5c5d6e30d017683"} Apr 17 17:26:35.255061 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.255016 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wxqd8" event={"ID":"141084a6-6657-4416-a7f1-21339dfd8b0a","Type":"ContainerStarted","Data":"b38a3715d6a7e3143232bef44a36b23d9a5c28e3a259012e752e7a4aeca50670"} Apr 17 17:26:35.256749 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.256724 2576 generic.go:358] "Generic (PLEG): container finished" podID="db9baa965611dba5e7069bbf2b1b2600" containerID="576237777ccf309c49688415ad6e293b33cab9ed59ba4e4213ef26546ee62541" exitCode=0 
Apr 17 17:26:35.256818 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.256805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" event={"ID":"db9baa965611dba5e7069bbf2b1b2600","Type":"ContainerDied","Data":"576237777ccf309c49688415ad6e293b33cab9ed59ba4e4213ef26546ee62541"} Apr 17 17:26:35.258251 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.258214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lrfkm" event={"ID":"c578d9f0-5deb-4aba-b916-5f7c6bec807d","Type":"ContainerStarted","Data":"ae353e812e3782ceca02905d44a6fade7083ed248d1dcf0625b2db5a49a8917d"} Apr 17 17:26:35.259802 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.259773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" event={"ID":"ea346c91-9370-4cf0-be0a-d3d37d74e482","Type":"ContainerStarted","Data":"d6737806cf8648245dbf7935ff9bed12d42d67ce307bbb75eeabfd6a083a204c"} Apr 17 17:26:35.264454 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.264412 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-109.ec2.internal" podStartSLOduration=21.264398671 podStartE2EDuration="21.264398671s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:34.277685239 +0000 UTC m=+20.853033528" watchObservedRunningTime="2026-04-17 17:26:35.264398671 +0000 UTC m=+21.839746936" Apr 17 17:26:35.264954 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.264919 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xn5kt" podStartSLOduration=3.075347884 podStartE2EDuration="21.264906646s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" 
firstStartedPulling="2026-04-17 17:26:15.313851537 +0000 UTC m=+1.889199805" lastFinishedPulling="2026-04-17 17:26:33.503410294 +0000 UTC m=+20.078758567" observedRunningTime="2026-04-17 17:26:35.264519098 +0000 UTC m=+21.839867400" watchObservedRunningTime="2026-04-17 17:26:35.264906646 +0000 UTC m=+21.840254933" Apr 17 17:26:35.278424 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.278380 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wxqd8" podStartSLOduration=3.028118898 podStartE2EDuration="21.278361957s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:26:15.253510587 +0000 UTC m=+1.828858853" lastFinishedPulling="2026-04-17 17:26:33.503753646 +0000 UTC m=+20.079101912" observedRunningTime="2026-04-17 17:26:35.278150731 +0000 UTC m=+21.853499020" watchObservedRunningTime="2026-04-17 17:26:35.278361957 +0000 UTC m=+21.853710246" Apr 17 17:26:35.290795 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.290741 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-bqt6x" podStartSLOduration=3.084090291 podStartE2EDuration="21.290723533s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:26:15.298455884 +0000 UTC m=+1.873804150" lastFinishedPulling="2026-04-17 17:26:33.505089113 +0000 UTC m=+20.080437392" observedRunningTime="2026-04-17 17:26:35.289986356 +0000 UTC m=+21.865334645" watchObservedRunningTime="2026-04-17 17:26:35.290723533 +0000 UTC m=+21.866071821" Apr 17 17:26:35.334353 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.334132 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lrfkm" podStartSLOduration=3.16525373 podStartE2EDuration="21.334113646s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:26:15.334562868 +0000 UTC m=+1.909911134" 
lastFinishedPulling="2026-04-17 17:26:33.503422767 +0000 UTC m=+20.078771050" observedRunningTime="2026-04-17 17:26:35.321592382 +0000 UTC m=+21.896940671" watchObservedRunningTime="2026-04-17 17:26:35.334113646 +0000 UTC m=+21.909461935" Apr 17 17:26:35.336625 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.336599 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 17:26:35.961264 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.961210 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:35.964836 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.964341 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:26:35.336622069Z","UUID":"eb9bc06d-7a1d-4583-a577-69a755e4d1c1","Handler":null,"Name":"","Endpoint":""} Apr 17 17:26:35.966366 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.966337 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 17:26:35.966366 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:35.966372 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 17:26:36.264058 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:36.263911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" event={"ID":"db9baa965611dba5e7069bbf2b1b2600","Type":"ContainerStarted","Data":"58856f504158c56e309939a1b487727a9f6330e298572356f5c83ae1c0ad6a54"} Apr 17 17:26:36.268318 ip-10-0-137-109 kubenswrapper[2576]: I0417 
17:26:36.268284 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" event={"ID":"ea346c91-9370-4cf0-be0a-d3d37d74e482","Type":"ContainerStarted","Data":"40e85a0a1496f396de0fa7422e34b0fa2f9f5e43a20c5dbe01ddfe53b59713ef"} Apr 17 17:26:36.291123 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:36.291065 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-109.ec2.internal" podStartSLOduration=22.291044527 podStartE2EDuration="22.291044527s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:36.290991097 +0000 UTC m=+22.866339386" watchObservedRunningTime="2026-04-17 17:26:36.291044527 +0000 UTC m=+22.866392817" Apr 17 17:26:36.997227 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:36.997189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:36.997399 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:36.997189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5" Apr 17 17:26:36.997399 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:36.997301 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22" Apr 17 17:26:36.997399 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:36.997191 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:36.997517 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:36.997366 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700" Apr 17 17:26:36.997517 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:36.997465 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065" Apr 17 17:26:37.272013 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:37.271974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" event={"ID":"ea346c91-9370-4cf0-be0a-d3d37d74e482","Type":"ContainerStarted","Data":"00884da56c39a0f01ffcdbafe3a463998a62595cd96d713eaa204e8aee874263"} Apr 17 17:26:37.275127 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:37.275102 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 17:26:37.275545 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:37.275504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"f99704550d3418009427cdc63db8cf2be3c3bca510e22a3b6f9627c895ee103f"} Apr 17 17:26:37.289990 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:37.289943 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hbfgf" podStartSLOduration=2.244624642 podStartE2EDuration="23.289927429s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:26:15.222881947 +0000 UTC m=+1.798230213" lastFinishedPulling="2026-04-17 17:26:36.268184733 +0000 UTC m=+22.843533000" observedRunningTime="2026-04-17 17:26:37.289398995 +0000 UTC m=+23.864747282" watchObservedRunningTime="2026-04-17 17:26:37.289927429 +0000 UTC m=+23.865275714" Apr 17 17:26:38.053677 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:38.053639 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:38.054369 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:38.054344 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:38.277844 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:38.277821 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xn5kt" Apr 17 17:26:38.777922 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:38.777736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5" Apr 17 17:26:38.778201 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:38.777927 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:26:38.778201 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:38.778043 2576 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret podName:e8cb7f89-2b7c-45fe-9d10-6a4afcec6700 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:54.778008748 +0000 UTC m=+41.353357029 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret") pod "global-pull-secret-syncer-dhvm5" (UID: "e8cb7f89-2b7c-45fe-9d10-6a4afcec6700") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:26:38.996971 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:38.996935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:38.996971 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:38.996955 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:38.997226 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:38.996935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:38.997226 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:38.997119 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:38.997226 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:38.997199 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700"
Apr 17 17:26:38.997367 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:38.997287 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:39.282984 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:39.282959 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:26:39.283446 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:39.283417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"fdc6957de7dc6b7517d63d488d84ea63a001943a8af9745f3b1224dfc4eb52ca"}
Apr 17 17:26:39.283723 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:39.283695 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt"
Apr 17 17:26:39.283823 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:39.283728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt"
Apr 17 17:26:39.283883 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:39.283846 2576 scope.go:117] "RemoveContainer" containerID="c7ce09e67782f4c8da99cbbf252022e3290cf8e5480ec2012708480bd3c9e0cf"
Apr 17 17:26:39.300726 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:39.300691 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt"
Apr 17 17:26:40.714856 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:40.714824 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l28wh"]
Apr 17 17:26:40.715363 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:40.714956 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:40.715363 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:40.715101 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:40.723580 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:40.723530 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dhvm5"]
Apr 17 17:26:40.723852 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:40.723692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:40.723852 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:40.723807 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700"
Apr 17 17:26:40.724375 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:40.724347 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-494hr"]
Apr 17 17:26:40.724484 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:40.724473 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:40.724581 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:40.724559 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:41.289517 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:41.289484 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d6308be-3783-4dbc-bf78-b2d0765d73c8" containerID="283810aabc3c9ce06a2bf9bc374c0612e6f12afaba92164b4ada0cd894f30682" exitCode=0
Apr 17 17:26:41.289687 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:41.289574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpw89" event={"ID":"3d6308be-3783-4dbc-bf78-b2d0765d73c8","Type":"ContainerDied","Data":"283810aabc3c9ce06a2bf9bc374c0612e6f12afaba92164b4ada0cd894f30682"}
Apr 17 17:26:41.293064 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:41.293043 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:26:41.293431 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:41.293345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" event={"ID":"47b24c49-3baa-47b7-9b2e-e0fd6d27367d","Type":"ContainerStarted","Data":"30b3994e8146585b6e56a27d872f574125095177da7dc4822b1df400e5be5811"}
Apr 17 17:26:41.293634 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:41.293602 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt"
Apr 17 17:26:41.308201 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:41.308177 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt"
Apr 17 17:26:41.997291 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:41.997256 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:41.997752 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:41.997363 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:41.997752 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:41.997431 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:41.997752 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:41.997591 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700"
Apr 17 17:26:42.996826 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:42.996604 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:42.996961 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:42.996857 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:43.299440 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:43.299406 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d6308be-3783-4dbc-bf78-b2d0765d73c8" containerID="180ffbffbed9628460c3b2e301111baddb6e67b62a1debd650db8c320bcd8299" exitCode=0
Apr 17 17:26:43.299874 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:43.299496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpw89" event={"ID":"3d6308be-3783-4dbc-bf78-b2d0765d73c8","Type":"ContainerDied","Data":"180ffbffbed9628460c3b2e301111baddb6e67b62a1debd650db8c320bcd8299"}
Apr 17 17:26:43.310126 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:43.310081 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" podUID="47b24c49-3baa-47b7-9b2e-e0fd6d27367d" containerName="ovnkube-controller" probeResult="failure" output=""
Apr 17 17:26:43.324960 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:43.324911 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" podStartSLOduration=10.990789317 podStartE2EDuration="29.324895868s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:26:15.209965088 +0000 UTC m=+1.785313358" lastFinishedPulling="2026-04-17 17:26:33.544071643 +0000 UTC m=+20.119419909" observedRunningTime="2026-04-17 17:26:41.367059775 +0000 UTC m=+27.942408063" watchObservedRunningTime="2026-04-17 17:26:43.324895868 +0000 UTC m=+29.900244152"
Apr 17 17:26:43.997249 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:43.997221 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:43.997408 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:43.997306 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dhvm5" podUID="e8cb7f89-2b7c-45fe-9d10-6a4afcec6700"
Apr 17 17:26:43.997408 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:43.997361 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:43.997515 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:43.997488 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065"
Apr 17 17:26:44.303833 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:44.303797 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d6308be-3783-4dbc-bf78-b2d0765d73c8" containerID="53fb2200523c3803289e4aac56664c94d1a4fc678f84d1d9c2294b4eae26b81b" exitCode=0
Apr 17 17:26:44.304418 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:44.303854 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpw89" event={"ID":"3d6308be-3783-4dbc-bf78-b2d0765d73c8","Type":"ContainerDied","Data":"53fb2200523c3803289e4aac56664c94d1a4fc678f84d1d9c2294b4eae26b81b"}
Apr 17 17:26:44.997325 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:44.997280 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr"
Apr 17 17:26:44.997520 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:44.997423 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-494hr" podUID="cd2a6ac8-6e93-4277-9929-2c37daaabc22"
Apr 17 17:26:45.735980 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.735897 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-109.ec2.internal" event="NodeReady"
Apr 17 17:26:45.736424 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.736071 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 17:26:45.769720 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.769683 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-68866946cf-4l6t4"]
Apr 17 17:26:45.792125 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.792082 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cgfbj"]
Apr 17 17:26:45.792302 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.792269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.794966 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.794940 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 17:26:45.795398 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.795267 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 17:26:45.795398 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.795274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-pcbgd\""
Apr 17 17:26:45.795398 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.795274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 17:26:45.802153 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.802120 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 17:26:45.811318 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.811284 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68866946cf-4l6t4"]
Apr 17 17:26:45.811318 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.811322 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-smpcv"]
Apr 17 17:26:45.811531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.811474 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cgfbj"
Apr 17 17:26:45.814239 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.814210 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 17:26:45.814239 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.814227 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 17:26:45.814435 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.814212 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-29fk4\""
Apr 17 17:26:45.814435 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.814248 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 17:26:45.826587 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.826554 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-smpcv"]
Apr 17 17:26:45.826587 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.826587 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cgfbj"]
Apr 17 17:26:45.826778 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.826711 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:45.829280 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.829253 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 17:26:45.829280 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.829277 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 17:26:45.829478 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.829351 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dxpw7\""
Apr 17 17:26:45.835966 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.835940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-installation-pull-secrets\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.836124 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.836046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-bound-sa-token\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.836124 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.836084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd85j\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-kube-api-access-pd85j\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.836230 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.836186 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-image-registry-private-configuration\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.836284 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.836264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.836332 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.836297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/860ed815-44b1-4158-af77-fb201acf8cbf-ca-trust-extracted\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.836332 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.836322 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-trusted-ca\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.836416 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.836377 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-registry-certificates\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.936951 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.936910 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7nh4\" (UniqueName: \"kubernetes.io/projected/e0d9b600-a7c0-458a-b964-3feb2fe753c5-kube-api-access-m7nh4\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:45.937170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.936976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/860ed815-44b1-4158-af77-fb201acf8cbf-ca-trust-extracted\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-trusted-ca\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-registry-certificates\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-installation-pull-secrets\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937170 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:45.937136 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:26:45.937170 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:45.937157 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68866946cf-4l6t4: secret "image-registry-tls" not found
Apr 17 17:26:45.937170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-bound-sa-token\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj"
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:45.937215 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls podName:860ed815-44b1-4158-af77-fb201acf8cbf nodeName:}" failed. No retries permitted until 2026-04-17 17:26:46.437195575 +0000 UTC m=+33.012543841 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls") pod "image-registry-68866946cf-4l6t4" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf") : secret "image-registry-tls" not found
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd85j\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-kube-api-access-pd85j\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hp67\" (UniqueName: \"kubernetes.io/projected/9e4659f9-8397-426b-a4db-39edf813f27a-kube-api-access-2hp67\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj"
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-image-registry-private-configuration\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0d9b600-a7c0-458a-b964-3feb2fe753c5-config-volume\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0d9b600-a7c0-458a-b964-3feb2fe753c5-tmp-dir\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:45.937554 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937488 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/860ed815-44b1-4158-af77-fb201acf8cbf-ca-trust-extracted\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.937974 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.937931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-registry-certificates\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.938167 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.938141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-trusted-ca\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.942707 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.942685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-image-registry-private-configuration\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.942815 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.942791 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-installation-pull-secrets\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.947651 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.947627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-bound-sa-token\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.947785 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.947764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd85j\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-kube-api-access-pd85j\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:45.997349 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.997268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:26:45.997349 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:45.997312 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5"
Apr 17 17:26:46.000553 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.000525 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 17:26:46.000702 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.000576 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q7kb6\""
Apr 17 17:26:46.000702 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.000587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 17:26:46.038275 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.038231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hp67\" (UniqueName: \"kubernetes.io/projected/9e4659f9-8397-426b-a4db-39edf813f27a-kube-api-access-2hp67\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj"
Apr 17 17:26:46.038456 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.038306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0d9b600-a7c0-458a-b964-3feb2fe753c5-config-volume\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:46.038456 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.038336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0d9b600-a7c0-458a-b964-3feb2fe753c5-tmp-dir\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:46.038456 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.038376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:46.038456 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.038396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7nh4\" (UniqueName: \"kubernetes.io/projected/e0d9b600-a7c0-458a-b964-3feb2fe753c5-kube-api-access-m7nh4\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:46.038664 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.038463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj"
Apr 17 17:26:46.038664 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.038566 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 17:26:46.038664 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.038566 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 17:26:46.038664 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.038655 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls podName:e0d9b600-a7c0-458a-b964-3feb2fe753c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:46.538622371 +0000 UTC m=+33.113970650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls") pod "dns-default-smpcv" (UID: "e0d9b600-a7c0-458a-b964-3feb2fe753c5") : secret "dns-default-metrics-tls" not found
Apr 17 17:26:46.038858 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.038738 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert podName:9e4659f9-8397-426b-a4db-39edf813f27a nodeName:}" failed. No retries permitted until 2026-04-17 17:26:46.53871695 +0000 UTC m=+33.114065217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert") pod "ingress-canary-cgfbj" (UID: "9e4659f9-8397-426b-a4db-39edf813f27a") : secret "canary-serving-cert" not found
Apr 17 17:26:46.038977 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.038965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e0d9b600-a7c0-458a-b964-3feb2fe753c5-config-volume\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:46.039077 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.038987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e0d9b600-a7c0-458a-b964-3feb2fe753c5-tmp-dir\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:46.047877 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.047847 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hp67\" (UniqueName: \"kubernetes.io/projected/9e4659f9-8397-426b-a4db-39edf813f27a-kube-api-access-2hp67\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj"
Apr 17 17:26:46.060072 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.060039 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7nh4\" (UniqueName: \"kubernetes.io/projected/e0d9b600-a7c0-458a-b964-3feb2fe753c5-kube-api-access-m7nh4\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv"
Apr 17 17:26:46.443151 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.443112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:26:46.443337 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.443278 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 17:26:46.443337 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.443300 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68866946cf-4l6t4: secret "image-registry-tls" not found
Apr 17 17:26:46.443430 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.443364 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls podName:860ed815-44b1-4158-af77-fb201acf8cbf nodeName:}" failed. No retries permitted until 2026-04-17 17:26:47.443343273 +0000 UTC m=+34.018691550 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls") pod "image-registry-68866946cf-4l6t4" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf") : secret "image-registry-tls" not found Apr 17 17:26:46.544344 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.544304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:26:46.544520 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.544472 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:46.544520 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.544503 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:26:46.544621 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.544550 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert podName:9e4659f9-8397-426b-a4db-39edf813f27a nodeName:}" failed. No retries permitted until 2026-04-17 17:26:47.544528681 +0000 UTC m=+34.119876949 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert") pod "ingress-canary-cgfbj" (UID: "9e4659f9-8397-426b-a4db-39edf813f27a") : secret "canary-serving-cert" not found Apr 17 17:26:46.544621 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.544612 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:46.544708 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.544665 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls podName:e0d9b600-a7c0-458a-b964-3feb2fe753c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:47.54464981 +0000 UTC m=+34.119998078 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls") pod "dns-default-smpcv" (UID: "e0d9b600-a7c0-458a-b964-3feb2fe753c5") : secret "dns-default-metrics-tls" not found Apr 17 17:26:46.645752 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.645715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:26:46.645951 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.645886 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:26:46.646017 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.645994 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:27:18.645972217 +0000 UTC m=+65.221320488 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : secret "metrics-daemon-secret" not found Apr 17 17:26:46.746520 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.746434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:46.746978 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.746602 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:26:46.746978 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.746623 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:26:46.746978 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.746633 2576 projected.go:194] Error preparing data for projected volume kube-api-access-mnw4k for pod openshift-network-diagnostics/network-check-target-494hr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:46.746978 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:46.746690 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k podName:cd2a6ac8-6e93-4277-9929-2c37daaabc22 
nodeName:}" failed. No retries permitted until 2026-04-17 17:27:18.746673934 +0000 UTC m=+65.322022201 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-mnw4k" (UniqueName: "kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k") pod "network-check-target-494hr" (UID: "cd2a6ac8-6e93-4277-9929-2c37daaabc22") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:26:46.996981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:46.996900 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:26:47.000156 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:47.000037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:26:47.000156 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:47.000037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:26:47.000366 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:47.000172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-r46n5\"" Apr 17 17:26:47.451543 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:47.451512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:26:47.451711 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:47.451685 2576 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:26:47.451711 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:47.451705 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68866946cf-4l6t4: secret "image-registry-tls" not found Apr 17 17:26:47.451840 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:47.451764 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls podName:860ed815-44b1-4158-af77-fb201acf8cbf nodeName:}" failed. No retries permitted until 2026-04-17 17:26:49.451748363 +0000 UTC m=+36.027096632 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls") pod "image-registry-68866946cf-4l6t4" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf") : secret "image-registry-tls" not found Apr 17 17:26:47.552546 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:47.552510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:26:47.552711 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:47.552579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:26:47.552711 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:47.552683 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 
17:26:47.552845 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:47.552729 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert podName:9e4659f9-8397-426b-a4db-39edf813f27a nodeName:}" failed. No retries permitted until 2026-04-17 17:26:49.552715764 +0000 UTC m=+36.128064030 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert") pod "ingress-canary-cgfbj" (UID: "9e4659f9-8397-426b-a4db-39edf813f27a") : secret "canary-serving-cert" not found Apr 17 17:26:47.552845 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:47.552680 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:47.552845 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:47.552828 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls podName:e0d9b600-a7c0-458a-b964-3feb2fe753c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:49.552810437 +0000 UTC m=+36.128158705 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls") pod "dns-default-smpcv" (UID: "e0d9b600-a7c0-458a-b964-3feb2fe753c5") : secret "dns-default-metrics-tls" not found Apr 17 17:26:49.470238 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:49.470045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:26:49.470238 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:49.470234 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:26:49.470749 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:49.470260 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68866946cf-4l6t4: secret "image-registry-tls" not found Apr 17 17:26:49.470749 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:49.470356 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls podName:860ed815-44b1-4158-af77-fb201acf8cbf nodeName:}" failed. No retries permitted until 2026-04-17 17:26:53.470338933 +0000 UTC m=+40.045687200 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls") pod "image-registry-68866946cf-4l6t4" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf") : secret "image-registry-tls" not found Apr 17 17:26:49.570752 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:49.570712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:26:49.570942 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:49.570813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:26:49.570942 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:49.570877 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:49.570942 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:49.570929 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:49.571104 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:49.570957 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls podName:e0d9b600-a7c0-458a-b964-3feb2fe753c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:26:53.570937378 +0000 UTC m=+40.146285661 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls") pod "dns-default-smpcv" (UID: "e0d9b600-a7c0-458a-b964-3feb2fe753c5") : secret "dns-default-metrics-tls" not found Apr 17 17:26:49.571104 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:49.570976 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert podName:9e4659f9-8397-426b-a4db-39edf813f27a nodeName:}" failed. No retries permitted until 2026-04-17 17:26:53.57096666 +0000 UTC m=+40.146314926 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert") pod "ingress-canary-cgfbj" (UID: "9e4659f9-8397-426b-a4db-39edf813f27a") : secret "canary-serving-cert" not found Apr 17 17:26:51.319902 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:51.319867 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d6308be-3783-4dbc-bf78-b2d0765d73c8" containerID="09ff1cbd220b4eabed72da4fee5749d43cdd74ee86412d7a93806a1f01f319e2" exitCode=0 Apr 17 17:26:51.320384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:51.319918 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpw89" event={"ID":"3d6308be-3783-4dbc-bf78-b2d0765d73c8","Type":"ContainerDied","Data":"09ff1cbd220b4eabed72da4fee5749d43cdd74ee86412d7a93806a1f01f319e2"} Apr 17 17:26:52.324470 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:52.324433 2576 generic.go:358] "Generic (PLEG): container finished" podID="3d6308be-3783-4dbc-bf78-b2d0765d73c8" containerID="efcbaf3824c84b35500ab29ae608916150c7532828aeb7c2f967dbbfe7e72fa0" exitCode=0 Apr 17 17:26:52.324866 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:52.324483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpw89" 
event={"ID":"3d6308be-3783-4dbc-bf78-b2d0765d73c8","Type":"ContainerDied","Data":"efcbaf3824c84b35500ab29ae608916150c7532828aeb7c2f967dbbfe7e72fa0"} Apr 17 17:26:53.328727 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:53.328694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tpw89" event={"ID":"3d6308be-3783-4dbc-bf78-b2d0765d73c8","Type":"ContainerStarted","Data":"8c83a6da82191eba21d424eaf388a40f00b84db8e0e119ed9c739bfbbb677fe2"} Apr 17 17:26:53.355827 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:53.355774 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tpw89" podStartSLOduration=4.365690516 podStartE2EDuration="39.355760024s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:26:15.265987657 +0000 UTC m=+1.841335924" lastFinishedPulling="2026-04-17 17:26:50.256057166 +0000 UTC m=+36.831405432" observedRunningTime="2026-04-17 17:26:53.354422188 +0000 UTC m=+39.929770476" watchObservedRunningTime="2026-04-17 17:26:53.355760024 +0000 UTC m=+39.931108312" Apr 17 17:26:53.504995 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:53.504956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:26:53.505174 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:53.505101 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:26:53.505174 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:53.505116 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-68866946cf-4l6t4: secret "image-registry-tls" not found Apr 17 17:26:53.505174 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:53.505166 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls podName:860ed815-44b1-4158-af77-fb201acf8cbf nodeName:}" failed. No retries permitted until 2026-04-17 17:27:01.505151625 +0000 UTC m=+48.080499896 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls") pod "image-registry-68866946cf-4l6t4" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf") : secret "image-registry-tls" not found Apr 17 17:26:53.606395 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:53.606303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:26:53.606537 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:53.606396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:26:53.606537 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:53.606449 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:26:53.606537 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:53.606510 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls podName:e0d9b600-a7c0-458a-b964-3feb2fe753c5 
nodeName:}" failed. No retries permitted until 2026-04-17 17:27:01.606494351 +0000 UTC m=+48.181842617 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls") pod "dns-default-smpcv" (UID: "e0d9b600-a7c0-458a-b964-3feb2fe753c5") : secret "dns-default-metrics-tls" not found Apr 17 17:26:53.606639 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:53.606534 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:26:53.606639 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:26:53.606585 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert podName:9e4659f9-8397-426b-a4db-39edf813f27a nodeName:}" failed. No retries permitted until 2026-04-17 17:27:01.606571755 +0000 UTC m=+48.181920026 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert") pod "ingress-canary-cgfbj" (UID: "9e4659f9-8397-426b-a4db-39edf813f27a") : secret "canary-serving-cert" not found Apr 17 17:26:54.814525 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:54.814488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: \"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5" Apr 17 17:26:54.818093 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:54.818066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e8cb7f89-2b7c-45fe-9d10-6a4afcec6700-original-pull-secret\") pod \"global-pull-secret-syncer-dhvm5\" (UID: 
\"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700\") " pod="kube-system/global-pull-secret-syncer-dhvm5" Apr 17 17:26:55.016080 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:55.016019 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dhvm5" Apr 17 17:26:55.165369 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:55.165337 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dhvm5"] Apr 17 17:26:55.171608 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:26:55.171577 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8cb7f89_2b7c_45fe_9d10_6a4afcec6700.slice/crio-90b7d2bcc57e5d13893fe834d74c22748e7fd0fb1db3b4349758955530208c4a WatchSource:0}: Error finding container 90b7d2bcc57e5d13893fe834d74c22748e7fd0fb1db3b4349758955530208c4a: Status 404 returned error can't find the container with id 90b7d2bcc57e5d13893fe834d74c22748e7fd0fb1db3b4349758955530208c4a Apr 17 17:26:55.333706 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:26:55.333671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dhvm5" event={"ID":"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700","Type":"ContainerStarted","Data":"90b7d2bcc57e5d13893fe834d74c22748e7fd0fb1db3b4349758955530208c4a"} Apr 17 17:27:00.345840 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:00.345740 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dhvm5" event={"ID":"e8cb7f89-2b7c-45fe-9d10-6a4afcec6700","Type":"ContainerStarted","Data":"9dc19ba83b43bd875e4aa43db96b55d5256fdaf9b698e3f8a31e6b068aa946fd"} Apr 17 17:27:00.361264 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:00.361214 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dhvm5" podStartSLOduration=32.490288549 podStartE2EDuration="37.361198917s" 
podCreationTimestamp="2026-04-17 17:26:23 +0000 UTC" firstStartedPulling="2026-04-17 17:26:55.173272787 +0000 UTC m=+41.748621053" lastFinishedPulling="2026-04-17 17:27:00.044183152 +0000 UTC m=+46.619531421" observedRunningTime="2026-04-17 17:27:00.360926057 +0000 UTC m=+46.936274348" watchObservedRunningTime="2026-04-17 17:27:00.361198917 +0000 UTC m=+46.936547206" Apr 17 17:27:01.565845 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:01.565802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:27:01.566272 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:01.565954 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:27:01.566272 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:01.565974 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68866946cf-4l6t4: secret "image-registry-tls" not found Apr 17 17:27:01.566272 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:01.566055 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls podName:860ed815-44b1-4158-af77-fb201acf8cbf nodeName:}" failed. No retries permitted until 2026-04-17 17:27:17.566014706 +0000 UTC m=+64.141362972 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls") pod "image-registry-68866946cf-4l6t4" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf") : secret "image-registry-tls" not found Apr 17 17:27:01.666294 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:01.666253 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:27:01.666486 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:01.666343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:27:01.666486 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:01.666411 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:01.666486 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:01.666446 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:01.666486 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:01.666489 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls podName:e0d9b600-a7c0-458a-b964-3feb2fe753c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:17.666463544 +0000 UTC m=+64.241811813 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls") pod "dns-default-smpcv" (UID: "e0d9b600-a7c0-458a-b964-3feb2fe753c5") : secret "dns-default-metrics-tls" not found Apr 17 17:27:01.666656 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:01.666503 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert podName:9e4659f9-8397-426b-a4db-39edf813f27a nodeName:}" failed. No retries permitted until 2026-04-17 17:27:17.666497463 +0000 UTC m=+64.241845729 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert") pod "ingress-canary-cgfbj" (UID: "9e4659f9-8397-426b-a4db-39edf813f27a") : secret "canary-serving-cert" not found Apr 17 17:27:13.309806 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:13.309778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5gqt" Apr 17 17:27:17.583433 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:17.583394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:27:17.583806 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:17.583554 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:27:17.583806 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:17.583573 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68866946cf-4l6t4: secret "image-registry-tls" not found Apr 17 
17:27:17.583806 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:17.583629 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls podName:860ed815-44b1-4158-af77-fb201acf8cbf nodeName:}" failed. No retries permitted until 2026-04-17 17:27:49.583612891 +0000 UTC m=+96.158961156 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls") pod "image-registry-68866946cf-4l6t4" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf") : secret "image-registry-tls" not found Apr 17 17:27:17.684222 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:17.684168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:27:17.684397 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:17.684252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:27:17.684397 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:17.684318 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:17.684397 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:17.684347 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:17.684397 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:17.684385 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert podName:9e4659f9-8397-426b-a4db-39edf813f27a nodeName:}" failed. No retries permitted until 2026-04-17 17:27:49.684369428 +0000 UTC m=+96.259717699 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert") pod "ingress-canary-cgfbj" (UID: "9e4659f9-8397-426b-a4db-39edf813f27a") : secret "canary-serving-cert" not found Apr 17 17:27:17.684397 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:17.684399 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls podName:e0d9b600-a7c0-458a-b964-3feb2fe753c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:27:49.684393318 +0000 UTC m=+96.259741585 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls") pod "dns-default-smpcv" (UID: "e0d9b600-a7c0-458a-b964-3feb2fe753c5") : secret "dns-default-metrics-tls" not found Apr 17 17:27:18.690271 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:18.690230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:27:18.690667 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:18.690358 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:27:18.690667 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:18.690413 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs 
podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:22.690398336 +0000 UTC m=+129.265746602 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : secret "metrics-daemon-secret" not found Apr 17 17:27:18.790740 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:18.790710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:27:18.793742 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:18.793723 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:27:18.803299 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:18.803276 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:27:18.813814 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:18.813784 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnw4k\" (UniqueName: \"kubernetes.io/projected/cd2a6ac8-6e93-4277-9929-2c37daaabc22-kube-api-access-mnw4k\") pod \"network-check-target-494hr\" (UID: \"cd2a6ac8-6e93-4277-9929-2c37daaabc22\") " pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:27:19.110804 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:19.110773 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-r46n5\"" Apr 17 
17:27:19.119215 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:19.119182 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:27:19.236539 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:19.236504 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-494hr"] Apr 17 17:27:19.239725 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:27:19.239693 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd2a6ac8_6e93_4277_9929_2c37daaabc22.slice/crio-ce1fba257fe0e99a2fe53f6d31af1d9470bcf538125c0f47144f526056e788bb WatchSource:0}: Error finding container ce1fba257fe0e99a2fe53f6d31af1d9470bcf538125c0f47144f526056e788bb: Status 404 returned error can't find the container with id ce1fba257fe0e99a2fe53f6d31af1d9470bcf538125c0f47144f526056e788bb Apr 17 17:27:19.382988 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:19.382902 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-494hr" event={"ID":"cd2a6ac8-6e93-4277-9929-2c37daaabc22","Type":"ContainerStarted","Data":"ce1fba257fe0e99a2fe53f6d31af1d9470bcf538125c0f47144f526056e788bb"} Apr 17 17:27:22.391158 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:22.391121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-494hr" event={"ID":"cd2a6ac8-6e93-4277-9929-2c37daaabc22","Type":"ContainerStarted","Data":"47bc4c7d466233edce1e534a1b4029b1e388a64964634b495e2e55ab82a09150"} Apr 17 17:27:22.391549 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:22.391253 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:27:22.431707 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:22.431657 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-494hr" podStartSLOduration=65.795217957 podStartE2EDuration="1m8.431644194s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:27:19.241290916 +0000 UTC m=+65.816639182" lastFinishedPulling="2026-04-17 17:27:21.877717151 +0000 UTC m=+68.453065419" observedRunningTime="2026-04-17 17:27:22.431186445 +0000 UTC m=+69.006534730" watchObservedRunningTime="2026-04-17 17:27:22.431644194 +0000 UTC m=+69.006992482" Apr 17 17:27:49.604671 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:49.604535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:27:49.605178 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:49.604688 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:27:49.605178 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:49.604709 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68866946cf-4l6t4: secret "image-registry-tls" not found Apr 17 17:27:49.605178 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:49.604765 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls podName:860ed815-44b1-4158-af77-fb201acf8cbf nodeName:}" failed. No retries permitted until 2026-04-17 17:28:53.604747693 +0000 UTC m=+160.180095959 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls") pod "image-registry-68866946cf-4l6t4" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf") : secret "image-registry-tls" not found Apr 17 17:27:49.705396 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:49.705345 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:27:49.705555 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:49.705438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:27:49.705555 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:49.705505 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:27:49.705555 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:49.705514 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:27:49.705650 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:49.705586 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls podName:e0d9b600-a7c0-458a-b964-3feb2fe753c5 nodeName:}" failed. No retries permitted until 2026-04-17 17:28:53.70556431 +0000 UTC m=+160.280912643 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls") pod "dns-default-smpcv" (UID: "e0d9b600-a7c0-458a-b964-3feb2fe753c5") : secret "dns-default-metrics-tls" not found Apr 17 17:27:49.705650 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:27:49.705600 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert podName:9e4659f9-8397-426b-a4db-39edf813f27a nodeName:}" failed. No retries permitted until 2026-04-17 17:28:53.705594457 +0000 UTC m=+160.280942724 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert") pod "ingress-canary-cgfbj" (UID: "9e4659f9-8397-426b-a4db-39edf813f27a") : secret "canary-serving-cert" not found Apr 17 17:27:53.396264 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:27:53.396231 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-494hr" Apr 17 17:28:22.750766 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:22.750713 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:28:22.751303 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:22.750836 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:28:22.751303 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:22.750904 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs podName:c56ede72-4e1e-4a75-9ebe-eabfdfcd2065 nodeName:}" 
failed. No retries permitted until 2026-04-17 17:30:24.750890973 +0000 UTC m=+251.326239238 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs") pod "network-metrics-daemon-l28wh" (UID: "c56ede72-4e1e-4a75-9ebe-eabfdfcd2065") : secret "metrics-daemon-secret" not found Apr 17 17:28:27.715082 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.715050 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps"] Apr 17 17:28:27.717923 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.717895 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps" Apr 17 17:28:27.718748 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.718725 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb"] Apr 17 17:28:27.720652 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.720630 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-sj94z\"" Apr 17 17:28:27.720764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.720637 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:28:27.720764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.720667 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 17:28:27.721287 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.721272 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vz885"] Apr 17 17:28:27.721453 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.721437 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" Apr 17 17:28:27.723720 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.723691 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-722rn"] Apr 17 17:28:27.723835 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.723786 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-vz885" Apr 17 17:28:27.723835 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.723822 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 17 17:28:27.724905 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.724880 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:28:27.725233 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.725211 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-t22zs\"" Apr 17 17:28:27.726715 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.725274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 17 17:28:27.728171 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.728150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:28:27.728357 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.728321 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wdtl5\"" Apr 17 
17:28:27.728521 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.728207 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 17:28:27.728649 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.728258 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:28:27.728722 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.728155 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 17:28:27.732063 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.731227 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps"] Apr 17 17:28:27.732063 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.731320 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:27.733707 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.733685 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vz885"] Apr 17 17:28:27.734357 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.734338 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:28:27.734488 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.734413 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kcqnk\"" Apr 17 17:28:27.734588 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.734366 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 17:28:27.734588 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:28:27.734344 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 17:28:27.734742 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.734391 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 17:28:27.734742 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.734755 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-722rn"] Apr 17 17:28:27.737078 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.737053 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 17:28:27.739003 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.738969 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 17:28:27.745193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.745170 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb"] Apr 17 17:28:27.814564 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.814524 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"] Apr 17 17:28:27.817544 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.817524 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9" Apr 17 17:28:27.820843 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.820823 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 17:28:27.821166 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.821148 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 17:28:27.821379 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.821326 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 17:28:27.821548 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.821530 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:28:27.822111 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.822096 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-zxf26\"" Apr 17 17:28:27.827755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.827734 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"] Apr 17 17:28:27.888159 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwnw\" (UniqueName: \"kubernetes.io/projected/59eaa364-77a3-4e8d-b694-75e3c4a185b8-kube-api-access-dqwnw\") pod \"volume-data-source-validator-7c6cbb6c87-k7wps\" (UID: 
\"59eaa364-77a3-4e8d-b694-75e3c4a185b8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps" Apr 17 17:28:27.888159 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888167 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" Apr 17 17:28:27.888375 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32d852f3-1d88-49d6-93e3-d36b1a499102-service-ca-bundle\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885" Apr 17 17:28:27.888375 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/32d852f3-1d88-49d6-93e3-d36b1a499102-snapshots\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885" Apr 17 17:28:27.888375 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d852f3-1d88-49d6-93e3-d36b1a499102-serving-cert\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885" Apr 17 17:28:27.888375 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:28:27.888324 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chx87\" (UniqueName: \"kubernetes.io/projected/32d852f3-1d88-49d6-93e3-d36b1a499102-kube-api-access-chx87\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885" Apr 17 17:28:27.888375 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888342 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rmff\" (UniqueName: \"kubernetes.io/projected/563dc28b-3cac-43af-adc2-31d0202d2905-kube-api-access-5rmff\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:27.888375 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563dc28b-3cac-43af-adc2-31d0202d2905-config\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:27.888552 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32d852f3-1d88-49d6-93e3-d36b1a499102-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885" Apr 17 17:28:27.888552 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/563dc28b-3cac-43af-adc2-31d0202d2905-serving-cert\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:27.888552 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888490 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32d852f3-1d88-49d6-93e3-d36b1a499102-tmp\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885" Apr 17 17:28:27.888552 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/563dc28b-3cac-43af-adc2-31d0202d2905-trusted-ca\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:27.888661 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.888554 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npckt\" (UniqueName: \"kubernetes.io/projected/adab37b6-8e36-456e-ac19-5ae45ffa56bd-kube-api-access-npckt\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" Apr 17 17:28:27.989572 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32d852f3-1d88-49d6-93e3-d36b1a499102-tmp\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " 
pod="openshift-insights/insights-operator-585dfdc468-vz885" Apr 17 17:28:27.989572 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef3c872-c400-4d98-9028-72e95653a455-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9" Apr 17 17:28:27.989755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/563dc28b-3cac-43af-adc2-31d0202d2905-trusted-ca\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:27.989755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-npckt\" (UniqueName: \"kubernetes.io/projected/adab37b6-8e36-456e-ac19-5ae45ffa56bd-kube-api-access-npckt\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" Apr 17 17:28:27.989755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96p48\" (UniqueName: \"kubernetes.io/projected/4ef3c872-c400-4d98-9028-72e95653a455-kube-api-access-96p48\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:27.989755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989723 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dqwnw\" (UniqueName: \"kubernetes.io/projected/59eaa364-77a3-4e8d-b694-75e3c4a185b8-kube-api-access-dqwnw\") pod \"volume-data-source-validator-7c6cbb6c87-k7wps\" (UID: \"59eaa364-77a3-4e8d-b694-75e3c4a185b8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps"
Apr 17 17:28:27.989755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb"
Apr 17 17:28:27.989927 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989764 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/32d852f3-1d88-49d6-93e3-d36b1a499102-tmp\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.989927 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef3c872-c400-4d98-9028-72e95653a455-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:27.989927 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32d852f3-1d88-49d6-93e3-d36b1a499102-service-ca-bundle\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.989927 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:27.989862 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:28:27.989927 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/32d852f3-1d88-49d6-93e3-d36b1a499102-snapshots\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.989927 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989904 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d852f3-1d88-49d6-93e3-d36b1a499102-serving-cert\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.989927 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:27.989928 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls podName:adab37b6-8e36-456e-ac19-5ae45ffa56bd nodeName:}" failed. No retries permitted until 2026-04-17 17:28:28.489907862 +0000 UTC m=+135.065256129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bf8qb" (UID: "adab37b6-8e36-456e-ac19-5ae45ffa56bd") : secret "samples-operator-tls" not found
Apr 17 17:28:27.990296 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chx87\" (UniqueName: \"kubernetes.io/projected/32d852f3-1d88-49d6-93e3-d36b1a499102-kube-api-access-chx87\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.990296 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.989987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rmff\" (UniqueName: \"kubernetes.io/projected/563dc28b-3cac-43af-adc2-31d0202d2905-kube-api-access-5rmff\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:27.990296 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.990015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563dc28b-3cac-43af-adc2-31d0202d2905-config\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:27.990296 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.990107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32d852f3-1d88-49d6-93e3-d36b1a499102-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.990492 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.990347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563dc28b-3cac-43af-adc2-31d0202d2905-serving-cert\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:27.990492 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.990484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/32d852f3-1d88-49d6-93e3-d36b1a499102-snapshots\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.990595 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.990524 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/563dc28b-3cac-43af-adc2-31d0202d2905-trusted-ca\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:27.990595 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.990577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32d852f3-1d88-49d6-93e3-d36b1a499102-service-ca-bundle\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.990805 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.990781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563dc28b-3cac-43af-adc2-31d0202d2905-config\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:27.991051 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.991014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32d852f3-1d88-49d6-93e3-d36b1a499102-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.992430 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.992405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d852f3-1d88-49d6-93e3-d36b1a499102-serving-cert\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:27.992558 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:27.992542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563dc28b-3cac-43af-adc2-31d0202d2905-serving-cert\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:28.000437 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.000409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chx87\" (UniqueName: \"kubernetes.io/projected/32d852f3-1d88-49d6-93e3-d36b1a499102-kube-api-access-chx87\") pod \"insights-operator-585dfdc468-vz885\" (UID: \"32d852f3-1d88-49d6-93e3-d36b1a499102\") " pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:28.000630 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.000411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rmff\" (UniqueName: \"kubernetes.io/projected/563dc28b-3cac-43af-adc2-31d0202d2905-kube-api-access-5rmff\") pod \"console-operator-9d4b6777b-722rn\" (UID: \"563dc28b-3cac-43af-adc2-31d0202d2905\") " pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:28.000630 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.000459 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-npckt\" (UniqueName: \"kubernetes.io/projected/adab37b6-8e36-456e-ac19-5ae45ffa56bd-kube-api-access-npckt\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb"
Apr 17 17:28:28.000978 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.000960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqwnw\" (UniqueName: \"kubernetes.io/projected/59eaa364-77a3-4e8d-b694-75e3c4a185b8-kube-api-access-dqwnw\") pod \"volume-data-source-validator-7c6cbb6c87-k7wps\" (UID: \"59eaa364-77a3-4e8d-b694-75e3c4a185b8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps"
Apr 17 17:28:28.032468 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.032421 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps"
Apr 17 17:28:28.047398 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.047368 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-vz885"
Apr 17 17:28:28.053110 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.053084 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:28.091212 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.091174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef3c872-c400-4d98-9028-72e95653a455-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:28.091596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.091569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96p48\" (UniqueName: \"kubernetes.io/projected/4ef3c872-c400-4d98-9028-72e95653a455-kube-api-access-96p48\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:28.091718 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.091663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef3c872-c400-4d98-9028-72e95653a455-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:28.092292 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.092236 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ef3c872-c400-4d98-9028-72e95653a455-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:28.094122 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.094074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ef3c872-c400-4d98-9028-72e95653a455-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:28.104521 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.101719 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96p48\" (UniqueName: \"kubernetes.io/projected/4ef3c872-c400-4d98-9028-72e95653a455-kube-api-access-96p48\") pod \"kube-storage-version-migrator-operator-6769c5d45-hp4p9\" (UID: \"4ef3c872-c400-4d98-9028-72e95653a455\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:28.130852 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.130810 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"
Apr 17 17:28:28.174267 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.174193 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps"]
Apr 17 17:28:28.196824 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.196788 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-vz885"]
Apr 17 17:28:28.206713 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:28.206677 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32d852f3_1d88_49d6_93e3_d36b1a499102.slice/crio-007022b779fccb8adcc16dff5056344b569f39dd8b5f65d5a8e3763a4c8d3a0a WatchSource:0}: Error finding container 007022b779fccb8adcc16dff5056344b569f39dd8b5f65d5a8e3763a4c8d3a0a: Status 404 returned error can't find the container with id 007022b779fccb8adcc16dff5056344b569f39dd8b5f65d5a8e3763a4c8d3a0a
Apr 17 17:28:28.220784 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.220751 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-722rn"]
Apr 17 17:28:28.224936 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:28.224901 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod563dc28b_3cac_43af_adc2_31d0202d2905.slice/crio-0e52d297b02263672e81fd3169fd4b99a24513d45932a28777ead4417344b32d WatchSource:0}: Error finding container 0e52d297b02263672e81fd3169fd4b99a24513d45932a28777ead4417344b32d: Status 404 returned error can't find the container with id 0e52d297b02263672e81fd3169fd4b99a24513d45932a28777ead4417344b32d
Apr 17 17:28:28.267336 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.267304 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9"]
Apr 17 17:28:28.271191 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:28.271163 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef3c872_c400_4d98_9028_72e95653a455.slice/crio-6e30cf450f5785d6b69a4b79cbe4cb59e3f854f7a9786061f0cc386ccc260d1e WatchSource:0}: Error finding container 6e30cf450f5785d6b69a4b79cbe4cb59e3f854f7a9786061f0cc386ccc260d1e: Status 404 returned error can't find the container with id 6e30cf450f5785d6b69a4b79cbe4cb59e3f854f7a9786061f0cc386ccc260d1e
Apr 17 17:28:28.494585 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.494549 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb"
Apr 17 17:28:28.494766 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:28.494705 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:28:28.494809 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:28.494774 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls podName:adab37b6-8e36-456e-ac19-5ae45ffa56bd nodeName:}" failed. No retries permitted until 2026-04-17 17:28:29.494751975 +0000 UTC m=+136.070100241 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bf8qb" (UID: "adab37b6-8e36-456e-ac19-5ae45ffa56bd") : secret "samples-operator-tls" not found
Apr 17 17:28:28.518295 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.518204 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" event={"ID":"563dc28b-3cac-43af-adc2-31d0202d2905","Type":"ContainerStarted","Data":"0e52d297b02263672e81fd3169fd4b99a24513d45932a28777ead4417344b32d"}
Apr 17 17:28:28.519228 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.519201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps" event={"ID":"59eaa364-77a3-4e8d-b694-75e3c4a185b8","Type":"ContainerStarted","Data":"dfd9fecba4aa35e49b24eeaa81124da8831123c2dd034d5c5c22354105af8191"}
Apr 17 17:28:28.520164 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.520139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vz885" event={"ID":"32d852f3-1d88-49d6-93e3-d36b1a499102","Type":"ContainerStarted","Data":"007022b779fccb8adcc16dff5056344b569f39dd8b5f65d5a8e3763a4c8d3a0a"}
Apr 17 17:28:28.521010 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:28.520989 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9" event={"ID":"4ef3c872-c400-4d98-9028-72e95653a455","Type":"ContainerStarted","Data":"6e30cf450f5785d6b69a4b79cbe4cb59e3f854f7a9786061f0cc386ccc260d1e"}
Apr 17 17:28:29.503890 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:29.503316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb"
Apr 17 17:28:29.503890 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:29.503467 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:28:29.503890 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:29.503532 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls podName:adab37b6-8e36-456e-ac19-5ae45ffa56bd nodeName:}" failed. No retries permitted until 2026-04-17 17:28:31.503511498 +0000 UTC m=+138.078859767 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bf8qb" (UID: "adab37b6-8e36-456e-ac19-5ae45ffa56bd") : secret "samples-operator-tls" not found
Apr 17 17:28:30.085168 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:30.085129 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5"]
Apr 17 17:28:30.088100 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:30.088067 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5"
Apr 17 17:28:30.090797 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:30.090764 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-79zdt\""
Apr 17 17:28:30.095182 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:30.095139 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5"]
Apr 17 17:28:30.209785 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:30.209748 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvgr9\" (UniqueName: \"kubernetes.io/projected/6417f091-7698-465c-978f-0eaed9974ca6-kube-api-access-dvgr9\") pod \"network-check-source-8894fc9bd-xpct5\" (UID: \"6417f091-7698-465c-978f-0eaed9974ca6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5"
Apr 17 17:28:30.310500 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:30.310453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dvgr9\" (UniqueName: \"kubernetes.io/projected/6417f091-7698-465c-978f-0eaed9974ca6-kube-api-access-dvgr9\") pod \"network-check-source-8894fc9bd-xpct5\" (UID: \"6417f091-7698-465c-978f-0eaed9974ca6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5"
Apr 17 17:28:30.320149 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:30.320119 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvgr9\" (UniqueName: \"kubernetes.io/projected/6417f091-7698-465c-978f-0eaed9974ca6-kube-api-access-dvgr9\") pod \"network-check-source-8894fc9bd-xpct5\" (UID: \"6417f091-7698-465c-978f-0eaed9974ca6\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5"
Apr 17 17:28:30.399210 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:30.399139 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5"
Apr 17 17:28:31.519498 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.519455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb"
Apr 17 17:28:31.519985 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:31.519604 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 17 17:28:31.519985 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:31.519647 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls podName:adab37b6-8e36-456e-ac19-5ae45ffa56bd nodeName:}" failed. No retries permitted until 2026-04-17 17:28:35.519633683 +0000 UTC m=+142.094981949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bf8qb" (UID: "adab37b6-8e36-456e-ac19-5ae45ffa56bd") : secret "samples-operator-tls" not found
Apr 17 17:28:31.531241 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.531038 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vz885" event={"ID":"32d852f3-1d88-49d6-93e3-d36b1a499102","Type":"ContainerStarted","Data":"b4ee937f269bf069443137c2c14deebe25d14a1498a31fbdae67529ab5142119"}
Apr 17 17:28:31.532575 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.532534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9" event={"ID":"4ef3c872-c400-4d98-9028-72e95653a455","Type":"ContainerStarted","Data":"1087e54b941c08ad943da4c7c89854a0ca67b37de49cc1838f042e9d5bedaf41"}
Apr 17 17:28:31.533904 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.533875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" event={"ID":"563dc28b-3cac-43af-adc2-31d0202d2905","Type":"ContainerStarted","Data":"f0c62bd78dc577674e0a4cd9545dc6b799da673c3580c26009c5e839459aa88d"}
Apr 17 17:28:31.534357 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.534313 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-722rn"
Apr 17 17:28:31.535548 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.535516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps" event={"ID":"59eaa364-77a3-4e8d-b694-75e3c4a185b8","Type":"ContainerStarted","Data":"78d595cd00043d481e7dc4e15f7a7ed47e5542fc708b9dc2c9ea7b77557d97a3"}
Apr 17 17:28:31.535742 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.535713 2576 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-722rn container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.132.0.11:8443/readyz\": dial tcp 10.132.0.11:8443: connect: connection refused" start-of-body=
Apr 17 17:28:31.535801 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.535785 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" podUID="563dc28b-3cac-43af-adc2-31d0202d2905" containerName="console-operator" probeResult="failure" output="Get \"https://10.132.0.11:8443/readyz\": dial tcp 10.132.0.11:8443: connect: connection refused"
Apr 17 17:28:31.541278 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.541193 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5"]
Apr 17 17:28:31.546880 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:31.546844 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6417f091_7698_465c_978f_0eaed9974ca6.slice/crio-c61e26283801d4b0338ca98c7cf5dbde507aaf23bb93dfe2b31aa3e53e587f08 WatchSource:0}: Error finding container c61e26283801d4b0338ca98c7cf5dbde507aaf23bb93dfe2b31aa3e53e587f08: Status 404 returned error can't find the container with id c61e26283801d4b0338ca98c7cf5dbde507aaf23bb93dfe2b31aa3e53e587f08
Apr 17 17:28:31.552057 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.551954 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-vz885" podStartSLOduration=1.353772623 podStartE2EDuration="4.551935491s" podCreationTimestamp="2026-04-17 17:28:27 +0000 UTC" firstStartedPulling="2026-04-17 17:28:28.208560778 +0000 UTC m=+134.783909045" lastFinishedPulling="2026-04-17 17:28:31.406723635 +0000 UTC m=+137.982071913" observedRunningTime="2026-04-17 17:28:31.551273988 +0000 UTC m=+138.126622289" watchObservedRunningTime="2026-04-17 17:28:31.551935491 +0000 UTC m=+138.127283780"
Apr 17 17:28:31.568393 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.566876 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-k7wps" podStartSLOduration=1.347325953 podStartE2EDuration="4.566843267s" podCreationTimestamp="2026-04-17 17:28:27 +0000 UTC" firstStartedPulling="2026-04-17 17:28:28.185392019 +0000 UTC m=+134.760740284" lastFinishedPulling="2026-04-17 17:28:31.404909316 +0000 UTC m=+137.980257598" observedRunningTime="2026-04-17 17:28:31.566561085 +0000 UTC m=+138.141909374" watchObservedRunningTime="2026-04-17 17:28:31.566843267 +0000 UTC m=+138.142191557"
Apr 17 17:28:31.584331 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.584240 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" podStartSLOduration=1.404795007 podStartE2EDuration="4.584220718s" podCreationTimestamp="2026-04-17 17:28:27 +0000 UTC" firstStartedPulling="2026-04-17 17:28:28.226772165 +0000 UTC m=+134.802120436" lastFinishedPulling="2026-04-17 17:28:31.406197875 +0000 UTC m=+137.981546147" observedRunningTime="2026-04-17 17:28:31.583563879 +0000 UTC m=+138.158912170" watchObservedRunningTime="2026-04-17 17:28:31.584220718 +0000 UTC m=+138.159569007"
Apr 17 17:28:31.603785 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:31.602748 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9" podStartSLOduration=1.462904405 podStartE2EDuration="4.602729856s" podCreationTimestamp="2026-04-17 17:28:27 +0000 UTC" firstStartedPulling="2026-04-17 17:28:28.272958735 +0000 UTC m=+134.848307001" lastFinishedPulling="2026-04-17 17:28:31.412784186 +0000 UTC m=+137.988132452" observedRunningTime="2026-04-17 17:28:31.602379217 +0000 UTC m=+138.177727509" watchObservedRunningTime="2026-04-17 17:28:31.602729856 +0000 UTC m=+138.178078145"
Apr 17 17:28:32.540008 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:32.539978 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/0.log"
Apr 17 17:28:32.540465 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:32.540019 2576 generic.go:358] "Generic (PLEG): container finished" podID="563dc28b-3cac-43af-adc2-31d0202d2905" containerID="f0c62bd78dc577674e0a4cd9545dc6b799da673c3580c26009c5e839459aa88d" exitCode=255
Apr 17 17:28:32.540465 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:32.540121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" event={"ID":"563dc28b-3cac-43af-adc2-31d0202d2905","Type":"ContainerDied","Data":"f0c62bd78dc577674e0a4cd9545dc6b799da673c3580c26009c5e839459aa88d"}
Apr 17 17:28:32.540465 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:32.540392 2576 scope.go:117] "RemoveContainer" containerID="f0c62bd78dc577674e0a4cd9545dc6b799da673c3580c26009c5e839459aa88d"
Apr 17 17:28:32.541551 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:32.541524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5" event={"ID":"6417f091-7698-465c-978f-0eaed9974ca6","Type":"ContainerStarted","Data":"5049e34223bc9c9f1a1b2d31a4b87588323181abc8fdb75fd62d9abf47ff1eb1"}
Apr 17 17:28:32.541736 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:32.541709 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5" event={"ID":"6417f091-7698-465c-978f-0eaed9974ca6","Type":"ContainerStarted","Data":"c61e26283801d4b0338ca98c7cf5dbde507aaf23bb93dfe2b31aa3e53e587f08"}
Apr 17 17:28:32.578855 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:32.578802 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-xpct5" podStartSLOduration=2.578784884 podStartE2EDuration="2.578784884s" podCreationTimestamp="2026-04-17 17:28:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:28:32.57842647 +0000 UTC m=+139.153774758" watchObservedRunningTime="2026-04-17 17:28:32.578784884 +0000 UTC m=+139.154133172"
Apr 17 17:28:33.545798 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.545772 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:28:33.546252 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.546150 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/0.log"
Apr 17 17:28:33.546252 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.546182 2576 generic.go:358] "Generic (PLEG): container finished" podID="563dc28b-3cac-43af-adc2-31d0202d2905" containerID="f7c21766d9e3fcfea1219025680b580ce9f4b97b21e919c59cfc0312859c19db" exitCode=255
Apr 17 17:28:33.546252 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.546218 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" event={"ID":"563dc28b-3cac-43af-adc2-31d0202d2905","Type":"ContainerDied","Data":"f7c21766d9e3fcfea1219025680b580ce9f4b97b21e919c59cfc0312859c19db"}
Apr 17 17:28:33.546352 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.546274 2576 scope.go:117] "RemoveContainer" containerID="f0c62bd78dc577674e0a4cd9545dc6b799da673c3580c26009c5e839459aa88d"
Apr 17 17:28:33.546510 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.546488 2576 scope.go:117] "RemoveContainer" containerID="f7c21766d9e3fcfea1219025680b580ce9f4b97b21e919c59cfc0312859c19db"
Apr 17 17:28:33.546734 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:33.546712 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-722rn_openshift-console-operator(563dc28b-3cac-43af-adc2-31d0202d2905)\"" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" podUID="563dc28b-3cac-43af-adc2-31d0202d2905"
Apr 17 17:28:33.585281 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.585252 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jl6fl"]
Apr 17 17:28:33.588428 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.588411 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jl6fl"
Apr 17 17:28:33.591374 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.591350 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 17 17:28:33.591563 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.591545 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 17 17:28:33.591691 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.591580 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 17 17:28:33.591691 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.591616 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 17 17:28:33.591815 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.591720 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-8dktd\""
Apr 17 17:28:33.596086 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.596064 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jl6fl"]
Apr 17 17:28:33.690078 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.690039 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b"]
Apr 17 17:28:33.692945 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.692926 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" Apr 17 17:28:33.695880 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.695860 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 17:28:33.695880 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.695872 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-fppct\"" Apr 17 17:28:33.696074 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.695965 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 17:28:33.700603 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.700579 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b"] Apr 17 17:28:33.736875 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.736836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/686847f9-7d2b-45a1-82dc-ff6f671e7657-signing-key\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.737096 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.736940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msjcj\" (UniqueName: \"kubernetes.io/projected/686847f9-7d2b-45a1-82dc-ff6f671e7657-kube-api-access-msjcj\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.737096 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.737043 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/686847f9-7d2b-45a1-82dc-ff6f671e7657-signing-cabundle\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.838313 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.838239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msjcj\" (UniqueName: \"kubernetes.io/projected/686847f9-7d2b-45a1-82dc-ff6f671e7657-kube-api-access-msjcj\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.838313 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.838288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/686847f9-7d2b-45a1-82dc-ff6f671e7657-signing-cabundle\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.838509 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.838484 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/686847f9-7d2b-45a1-82dc-ff6f671e7657-signing-key\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.838565 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.838550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hhz5\" (UniqueName: \"kubernetes.io/projected/32e7e58d-4b83-4c34-9bc6-023ef74bde5d-kube-api-access-7hhz5\") pod \"migrator-74bb7799d9-hkj2b\" (UID: 
\"32e7e58d-4b83-4c34-9bc6-023ef74bde5d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" Apr 17 17:28:33.838858 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.838836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/686847f9-7d2b-45a1-82dc-ff6f671e7657-signing-cabundle\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.840820 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.840798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/686847f9-7d2b-45a1-82dc-ff6f671e7657-signing-key\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.847875 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.847852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msjcj\" (UniqueName: \"kubernetes.io/projected/686847f9-7d2b-45a1-82dc-ff6f671e7657-kube-api-access-msjcj\") pod \"service-ca-865cb79987-jl6fl\" (UID: \"686847f9-7d2b-45a1-82dc-ff6f671e7657\") " pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.897395 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.897360 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jl6fl" Apr 17 17:28:33.939449 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.939343 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhz5\" (UniqueName: \"kubernetes.io/projected/32e7e58d-4b83-4c34-9bc6-023ef74bde5d-kube-api-access-7hhz5\") pod \"migrator-74bb7799d9-hkj2b\" (UID: \"32e7e58d-4b83-4c34-9bc6-023ef74bde5d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" Apr 17 17:28:33.947582 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:33.947549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhz5\" (UniqueName: \"kubernetes.io/projected/32e7e58d-4b83-4c34-9bc6-023ef74bde5d-kube-api-access-7hhz5\") pod \"migrator-74bb7799d9-hkj2b\" (UID: \"32e7e58d-4b83-4c34-9bc6-023ef74bde5d\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" Apr 17 17:28:34.002381 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:34.002343 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" Apr 17 17:28:34.018065 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:34.018012 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jl6fl"] Apr 17 17:28:34.020532 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:34.020498 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686847f9_7d2b_45a1_82dc_ff6f671e7657.slice/crio-3d65cfac554e300a10e40484632a4cd8d3e1a790830d8d5b3bf38a0f03860d2e WatchSource:0}: Error finding container 3d65cfac554e300a10e40484632a4cd8d3e1a790830d8d5b3bf38a0f03860d2e: Status 404 returned error can't find the container with id 3d65cfac554e300a10e40484632a4cd8d3e1a790830d8d5b3bf38a0f03860d2e Apr 17 17:28:34.120008 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:34.119923 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b"] Apr 17 17:28:34.122587 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:34.122557 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32e7e58d_4b83_4c34_9bc6_023ef74bde5d.slice/crio-e7dd66d2f961bb6ead97c28a643d177d72e46f7fd9ad9863079ccbd68a810e7f WatchSource:0}: Error finding container e7dd66d2f961bb6ead97c28a643d177d72e46f7fd9ad9863079ccbd68a810e7f: Status 404 returned error can't find the container with id e7dd66d2f961bb6ead97c28a643d177d72e46f7fd9ad9863079ccbd68a810e7f Apr 17 17:28:34.550954 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:34.550923 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 17:28:34.551438 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:34.551355 2576 scope.go:117] 
"RemoveContainer" containerID="f7c21766d9e3fcfea1219025680b580ce9f4b97b21e919c59cfc0312859c19db" Apr 17 17:28:34.551606 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:34.551577 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-722rn_openshift-console-operator(563dc28b-3cac-43af-adc2-31d0202d2905)\"" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" podUID="563dc28b-3cac-43af-adc2-31d0202d2905" Apr 17 17:28:34.552402 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:34.552368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jl6fl" event={"ID":"686847f9-7d2b-45a1-82dc-ff6f671e7657","Type":"ContainerStarted","Data":"3d65cfac554e300a10e40484632a4cd8d3e1a790830d8d5b3bf38a0f03860d2e"} Apr 17 17:28:34.553423 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:34.553398 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" event={"ID":"32e7e58d-4b83-4c34-9bc6-023ef74bde5d","Type":"ContainerStarted","Data":"e7dd66d2f961bb6ead97c28a643d177d72e46f7fd9ad9863079ccbd68a810e7f"} Apr 17 17:28:35.263582 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:35.263554 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lrfkm_c578d9f0-5deb-4aba-b916-5f7c6bec807d/dns-node-resolver/0.log" Apr 17 17:28:35.554437 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:35.554344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" Apr 17 17:28:35.554854 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:35.554464 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 17 17:28:35.554854 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:35.554538 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls podName:adab37b6-8e36-456e-ac19-5ae45ffa56bd nodeName:}" failed. No retries permitted until 2026-04-17 17:28:43.554516869 +0000 UTC m=+150.129865139 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bf8qb" (UID: "adab37b6-8e36-456e-ac19-5ae45ffa56bd") : secret "samples-operator-tls" not found Apr 17 17:28:36.263468 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:36.263440 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wxqd8_141084a6-6657-4416-a7f1-21339dfd8b0a/node-ca/0.log" Apr 17 17:28:36.559900 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:36.559799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" event={"ID":"32e7e58d-4b83-4c34-9bc6-023ef74bde5d","Type":"ContainerStarted","Data":"c3935200752cf332a189ee6c3f98721727aff82916aa0835ba47c071c2914faa"} Apr 17 17:28:36.559900 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:36.559836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" event={"ID":"32e7e58d-4b83-4c34-9bc6-023ef74bde5d","Type":"ContainerStarted","Data":"8085c2653a9185e2e67a69a2c363d40bf039c2bc7849104edf344055618557ee"} Apr 17 17:28:36.561008 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:28:36.560984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jl6fl" event={"ID":"686847f9-7d2b-45a1-82dc-ff6f671e7657","Type":"ContainerStarted","Data":"abb6e89e1fe276f3258e8f32dac0d87b27aeae1a918435b8814bb8a70a9e7b48"} Apr 17 17:28:36.593347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:36.593293 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jl6fl" podStartSLOduration=1.847565466 podStartE2EDuration="3.593279468s" podCreationTimestamp="2026-04-17 17:28:33 +0000 UTC" firstStartedPulling="2026-04-17 17:28:34.022446686 +0000 UTC m=+140.597794955" lastFinishedPulling="2026-04-17 17:28:35.76816069 +0000 UTC m=+142.343508957" observedRunningTime="2026-04-17 17:28:36.59221218 +0000 UTC m=+143.167560463" watchObservedRunningTime="2026-04-17 17:28:36.593279468 +0000 UTC m=+143.168627756" Apr 17 17:28:36.593517 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:36.593367 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hkj2b" podStartSLOduration=1.9548904569999999 podStartE2EDuration="3.593361629s" podCreationTimestamp="2026-04-17 17:28:33 +0000 UTC" firstStartedPulling="2026-04-17 17:28:34.124330524 +0000 UTC m=+140.699678790" lastFinishedPulling="2026-04-17 17:28:35.762801693 +0000 UTC m=+142.338149962" observedRunningTime="2026-04-17 17:28:36.576866261 +0000 UTC m=+143.152214550" watchObservedRunningTime="2026-04-17 17:28:36.593361629 +0000 UTC m=+143.168709916" Apr 17 17:28:38.053816 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:38.053778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:38.054306 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:38.054197 2576 scope.go:117] "RemoveContainer" 
containerID="f7c21766d9e3fcfea1219025680b580ce9f4b97b21e919c59cfc0312859c19db" Apr 17 17:28:38.054385 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:38.054365 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-722rn_openshift-console-operator(563dc28b-3cac-43af-adc2-31d0202d2905)\"" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" podUID="563dc28b-3cac-43af-adc2-31d0202d2905" Apr 17 17:28:41.534983 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:41.534937 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:41.535370 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:41.535355 2576 scope.go:117] "RemoveContainer" containerID="f7c21766d9e3fcfea1219025680b580ce9f4b97b21e919c59cfc0312859c19db" Apr 17 17:28:41.535535 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:41.535518 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-722rn_openshift-console-operator(563dc28b-3cac-43af-adc2-31d0202d2905)\"" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" podUID="563dc28b-3cac-43af-adc2-31d0202d2905" Apr 17 17:28:43.619789 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:43.619745 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" Apr 17 17:28:43.622236 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:43.622204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/adab37b6-8e36-456e-ac19-5ae45ffa56bd-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bf8qb\" (UID: \"adab37b6-8e36-456e-ac19-5ae45ffa56bd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" Apr 17 17:28:43.640212 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:43.640181 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" Apr 17 17:28:43.758698 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:43.758667 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb"] Apr 17 17:28:44.583647 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:44.583604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" event={"ID":"adab37b6-8e36-456e-ac19-5ae45ffa56bd","Type":"ContainerStarted","Data":"1c064274cb81a8a4b03fdf7888bfe36af77267bc82cba132380db13d36c92052"} Apr 17 17:28:45.587756 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:45.587724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" event={"ID":"adab37b6-8e36-456e-ac19-5ae45ffa56bd","Type":"ContainerStarted","Data":"c12f8f86ffc26317b437f5f876f87f56c87d3476de8dbdcde08a0ebb4ee81502"} Apr 17 17:28:45.588127 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:45.587763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" 
event={"ID":"adab37b6-8e36-456e-ac19-5ae45ffa56bd","Type":"ContainerStarted","Data":"ade880430e3750d83c312f7b5f546ac279766b58cf495e51f253252e4a3276db"} Apr 17 17:28:45.609297 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:45.609232 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bf8qb" podStartSLOduration=16.953448961 podStartE2EDuration="18.609213286s" podCreationTimestamp="2026-04-17 17:28:27 +0000 UTC" firstStartedPulling="2026-04-17 17:28:43.808519033 +0000 UTC m=+150.383867304" lastFinishedPulling="2026-04-17 17:28:45.46428336 +0000 UTC m=+152.039631629" observedRunningTime="2026-04-17 17:28:45.608217303 +0000 UTC m=+152.183565595" watchObservedRunningTime="2026-04-17 17:28:45.609213286 +0000 UTC m=+152.184561576" Apr 17 17:28:48.808380 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:48.808327 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" podUID="860ed815-44b1-4158-af77-fb201acf8cbf" Apr 17 17:28:48.822610 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:48.822572 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cgfbj" podUID="9e4659f9-8397-426b-a4db-39edf813f27a" Apr 17 17:28:48.837757 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:48.837725 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-smpcv" podUID="e0d9b600-a7c0-458a-b964-3feb2fe753c5" Apr 17 17:28:49.009798 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:28:49.009751 2576 pod_workers.go:1301] "Error 
syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-l28wh" podUID="c56ede72-4e1e-4a75-9ebe-eabfdfcd2065" Apr 17 17:28:49.599421 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:49.599386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:28:49.599597 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:49.599386 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:28:53.685956 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.685905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:28:53.688314 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.688292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"image-registry-68866946cf-4l6t4\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") " pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:28:53.787310 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.787262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:28:53.787477 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.787363 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:28:53.789713 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.789677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0d9b600-a7c0-458a-b964-3feb2fe753c5-metrics-tls\") pod \"dns-default-smpcv\" (UID: \"e0d9b600-a7c0-458a-b964-3feb2fe753c5\") " pod="openshift-dns/dns-default-smpcv" Apr 17 17:28:53.789815 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.789744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e4659f9-8397-426b-a4db-39edf813f27a-cert\") pod \"ingress-canary-cgfbj\" (UID: \"9e4659f9-8397-426b-a4db-39edf813f27a\") " pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:28:53.804376 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.804354 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-pcbgd\"" Apr 17 17:28:53.804376 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.804369 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-29fk4\"" Apr 17 17:28:53.811098 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.811081 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:28:53.811174 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.811155 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cgfbj" Apr 17 17:28:53.945107 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.945007 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68866946cf-4l6t4"] Apr 17 17:28:53.947402 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:53.947377 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860ed815_44b1_4158_af77_fb201acf8cbf.slice/crio-655e75aeb6f944fdbce081ed9f4a329ccaa9eddb2bd90dad4c02b0302e195038 WatchSource:0}: Error finding container 655e75aeb6f944fdbce081ed9f4a329ccaa9eddb2bd90dad4c02b0302e195038: Status 404 returned error can't find the container with id 655e75aeb6f944fdbce081ed9f4a329ccaa9eddb2bd90dad4c02b0302e195038 Apr 17 17:28:53.965980 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:53.965960 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cgfbj"] Apr 17 17:28:53.968842 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:53.968817 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e4659f9_8397_426b_a4db_39edf813f27a.slice/crio-e1597742646cc9facefb80dd4d64d7527552848165026525469c69dfc4cd14ae WatchSource:0}: Error finding container e1597742646cc9facefb80dd4d64d7527552848165026525469c69dfc4cd14ae: Status 404 returned error can't find the container with id e1597742646cc9facefb80dd4d64d7527552848165026525469c69dfc4cd14ae Apr 17 17:28:54.058545 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.058510 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-bqbpv"] Apr 17 17:28:54.062337 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.062309 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.065796 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.065763 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:28:54.065948 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.065798 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jslpj\"" Apr 17 17:28:54.065948 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.065813 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:28:54.076034 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.075961 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bqbpv"] Apr 17 17:28:54.089755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.089719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e7ce222f-b049-4136-aa1a-e11fe5ae538b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.089942 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.089816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e7ce222f-b049-4136-aa1a-e11fe5ae538b-crio-socket\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.089942 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.089877 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e7ce222f-b049-4136-aa1a-e11fe5ae538b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.089942 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.089908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6bgz\" (UniqueName: \"kubernetes.io/projected/e7ce222f-b049-4136-aa1a-e11fe5ae538b-kube-api-access-x6bgz\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.090139 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.090079 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7ce222f-b049-4136-aa1a-e11fe5ae538b-data-volume\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.190441 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.190409 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e7ce222f-b049-4136-aa1a-e11fe5ae538b-crio-socket\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.190636 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.190459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e7ce222f-b049-4136-aa1a-e11fe5ae538b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bqbpv\" (UID: 
\"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.190636 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.190477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6bgz\" (UniqueName: \"kubernetes.io/projected/e7ce222f-b049-4136-aa1a-e11fe5ae538b-kube-api-access-x6bgz\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.190636 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.190506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7ce222f-b049-4136-aa1a-e11fe5ae538b-data-volume\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.190636 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.190529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e7ce222f-b049-4136-aa1a-e11fe5ae538b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.190636 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.190552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e7ce222f-b049-4136-aa1a-e11fe5ae538b-crio-socket\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.190904 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.190879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/e7ce222f-b049-4136-aa1a-e11fe5ae538b-data-volume\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.191010 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.190985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e7ce222f-b049-4136-aa1a-e11fe5ae538b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.192840 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.192820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e7ce222f-b049-4136-aa1a-e11fe5ae538b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.211006 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.210936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6bgz\" (UniqueName: \"kubernetes.io/projected/e7ce222f-b049-4136-aa1a-e11fe5ae538b-kube-api-access-x6bgz\") pod \"insights-runtime-extractor-bqbpv\" (UID: \"e7ce222f-b049-4136-aa1a-e11fe5ae538b\") " pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.373572 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.373538 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-bqbpv" Apr 17 17:28:54.504128 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.504084 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-bqbpv"] Apr 17 17:28:54.507965 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:54.507932 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7ce222f_b049_4136_aa1a_e11fe5ae538b.slice/crio-3325af86e1646008931b0079bcadbc0573a268a7cad10e514b5728153ba9287c WatchSource:0}: Error finding container 3325af86e1646008931b0079bcadbc0573a268a7cad10e514b5728153ba9287c: Status 404 returned error can't find the container with id 3325af86e1646008931b0079bcadbc0573a268a7cad10e514b5728153ba9287c Apr 17 17:28:54.614240 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.614201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bqbpv" event={"ID":"e7ce222f-b049-4136-aa1a-e11fe5ae538b","Type":"ContainerStarted","Data":"a914fda3343bc86e2cbd33de51961451eb94ea22635b44a81c2d785abea4f621"} Apr 17 17:28:54.614240 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.614247 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bqbpv" event={"ID":"e7ce222f-b049-4136-aa1a-e11fe5ae538b","Type":"ContainerStarted","Data":"3325af86e1646008931b0079bcadbc0573a268a7cad10e514b5728153ba9287c"} Apr 17 17:28:54.615774 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.615742 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" event={"ID":"860ed815-44b1-4158-af77-fb201acf8cbf","Type":"ContainerStarted","Data":"498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3"} Apr 17 17:28:54.615920 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.615779 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" event={"ID":"860ed815-44b1-4158-af77-fb201acf8cbf","Type":"ContainerStarted","Data":"655e75aeb6f944fdbce081ed9f4a329ccaa9eddb2bd90dad4c02b0302e195038"} Apr 17 17:28:54.615920 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.615850 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:28:54.616962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.616936 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cgfbj" event={"ID":"9e4659f9-8397-426b-a4db-39edf813f27a","Type":"ContainerStarted","Data":"e1597742646cc9facefb80dd4d64d7527552848165026525469c69dfc4cd14ae"} Apr 17 17:28:54.636596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:54.636545 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" podStartSLOduration=160.636526407 podStartE2EDuration="2m40.636526407s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:28:54.635619971 +0000 UTC m=+161.210968259" watchObservedRunningTime="2026-04-17 17:28:54.636526407 +0000 UTC m=+161.211874701" Apr 17 17:28:55.997233 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:55.997149 2576 scope.go:117] "RemoveContainer" containerID="f7c21766d9e3fcfea1219025680b580ce9f4b97b21e919c59cfc0312859c19db" Apr 17 17:28:56.624644 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:56.624611 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bqbpv" event={"ID":"e7ce222f-b049-4136-aa1a-e11fe5ae538b","Type":"ContainerStarted","Data":"c5e4cdba63c855ab72bebcda8c9776788885539ce6e7d76d1113794dd866ab7e"} Apr 17 17:28:56.626144 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:28:56.626114 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cgfbj" event={"ID":"9e4659f9-8397-426b-a4db-39edf813f27a","Type":"ContainerStarted","Data":"a434ad352b5abeef7d85964a7b07bcef9d6375f38f11515d54832534b367a4d1"} Apr 17 17:28:56.627802 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:56.627782 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 17:28:56.627897 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:56.627844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" event={"ID":"563dc28b-3cac-43af-adc2-31d0202d2905","Type":"ContainerStarted","Data":"1aa081f49cf7377efff278005a2a7670f353e1cc4d4ce2a23b75e27d81f5a4a6"} Apr 17 17:28:56.628148 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:56.628126 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:56.661564 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:56.661505 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cgfbj" podStartSLOduration=129.974738815 podStartE2EDuration="2m11.661485462s" podCreationTimestamp="2026-04-17 17:26:45 +0000 UTC" firstStartedPulling="2026-04-17 17:28:53.970765257 +0000 UTC m=+160.546113528" lastFinishedPulling="2026-04-17 17:28:55.657511906 +0000 UTC m=+162.232860175" observedRunningTime="2026-04-17 17:28:56.643878541 +0000 UTC m=+163.219226831" watchObservedRunningTime="2026-04-17 17:28:56.661485462 +0000 UTC m=+163.236833751" Apr 17 17:28:56.900826 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:56.900747 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-722rn" Apr 17 17:28:57.080720 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.080688 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-7dsfs"] Apr 17 17:28:57.083705 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.083681 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7dsfs" Apr 17 17:28:57.086233 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.086206 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:28:57.086364 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.086236 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-bs9z8\"" Apr 17 17:28:57.086435 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.086361 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:28:57.093584 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.093557 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7dsfs"] Apr 17 17:28:57.115144 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.115110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56vzd\" (UniqueName: \"kubernetes.io/projected/57a4d511-24de-4e96-ab0c-2099602b447f-kube-api-access-56vzd\") pod \"downloads-6bcc868b7-7dsfs\" (UID: \"57a4d511-24de-4e96-ab0c-2099602b447f\") " pod="openshift-console/downloads-6bcc868b7-7dsfs" Apr 17 17:28:57.216159 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.216071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56vzd\" (UniqueName: 
\"kubernetes.io/projected/57a4d511-24de-4e96-ab0c-2099602b447f-kube-api-access-56vzd\") pod \"downloads-6bcc868b7-7dsfs\" (UID: \"57a4d511-24de-4e96-ab0c-2099602b447f\") " pod="openshift-console/downloads-6bcc868b7-7dsfs" Apr 17 17:28:57.228039 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.227995 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56vzd\" (UniqueName: \"kubernetes.io/projected/57a4d511-24de-4e96-ab0c-2099602b447f-kube-api-access-56vzd\") pod \"downloads-6bcc868b7-7dsfs\" (UID: \"57a4d511-24de-4e96-ab0c-2099602b447f\") " pod="openshift-console/downloads-6bcc868b7-7dsfs" Apr 17 17:28:57.392717 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.392675 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-7dsfs" Apr 17 17:28:57.512574 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.512543 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-7dsfs"] Apr 17 17:28:57.515876 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:28:57.515845 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a4d511_24de_4e96_ab0c_2099602b447f.slice/crio-b9a83ef49f569e6b139116e46002479ce9a91a3028a3468859d0da84885e5574 WatchSource:0}: Error finding container b9a83ef49f569e6b139116e46002479ce9a91a3028a3468859d0da84885e5574: Status 404 returned error can't find the container with id b9a83ef49f569e6b139116e46002479ce9a91a3028a3468859d0da84885e5574 Apr 17 17:28:57.632315 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.632276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-bqbpv" event={"ID":"e7ce222f-b049-4136-aa1a-e11fe5ae538b","Type":"ContainerStarted","Data":"f57573be24f92dee36cb341e872c9851d2a56945b5c761da8c55440fd51f6b88"} Apr 17 17:28:57.633311 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 17:28:57.633275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-7dsfs" event={"ID":"57a4d511-24de-4e96-ab0c-2099602b447f","Type":"ContainerStarted","Data":"b9a83ef49f569e6b139116e46002479ce9a91a3028a3468859d0da84885e5574"} Apr 17 17:28:57.654338 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:28:57.654281 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-bqbpv" podStartSLOduration=1.209276674 podStartE2EDuration="3.654266022s" podCreationTimestamp="2026-04-17 17:28:54 +0000 UTC" firstStartedPulling="2026-04-17 17:28:54.565978733 +0000 UTC m=+161.141326998" lastFinishedPulling="2026-04-17 17:28:57.010968066 +0000 UTC m=+163.586316346" observedRunningTime="2026-04-17 17:28:57.653890374 +0000 UTC m=+164.229238663" watchObservedRunningTime="2026-04-17 17:28:57.654266022 +0000 UTC m=+164.229614310" Apr 17 17:29:01.997106 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:01.997007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-smpcv" Apr 17 17:29:01.997528 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:01.997007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh" Apr 17 17:29:02.000093 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:02.000069 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dxpw7\"" Apr 17 17:29:02.008366 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:02.008342 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-smpcv" Apr 17 17:29:02.154377 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:02.154337 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-smpcv"] Apr 17 17:29:02.157687 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:29:02.157648 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d9b600_a7c0_458a_b964_3feb2fe753c5.slice/crio-2c0947c5cef2e03e387fd965c6149eb1a7b9bc56b66572a2c196e06045a213d9 WatchSource:0}: Error finding container 2c0947c5cef2e03e387fd965c6149eb1a7b9bc56b66572a2c196e06045a213d9: Status 404 returned error can't find the container with id 2c0947c5cef2e03e387fd965c6149eb1a7b9bc56b66572a2c196e06045a213d9 Apr 17 17:29:02.648751 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:02.648710 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smpcv" event={"ID":"e0d9b600-a7c0-458a-b964-3feb2fe753c5","Type":"ContainerStarted","Data":"2c0947c5cef2e03e387fd965c6149eb1a7b9bc56b66572a2c196e06045a213d9"} Apr 17 17:29:04.655757 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:04.655718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smpcv" event={"ID":"e0d9b600-a7c0-458a-b964-3feb2fe753c5","Type":"ContainerStarted","Data":"fb50a932f9b79425bc2e18be887a61d1627cec57234a213adf4f8d3a3b138d44"} Apr 17 17:29:04.655757 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:04.655762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smpcv" event={"ID":"e0d9b600-a7c0-458a-b964-3feb2fe753c5","Type":"ContainerStarted","Data":"4a725798e53e91e9535a858950598c1eee92cae7d1f45f2eb3753337c85eef9f"} Apr 17 17:29:04.656308 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:04.655793 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-smpcv" Apr 17 17:29:04.675500 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:04.675442 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-smpcv" podStartSLOduration=138.050541269 podStartE2EDuration="2m19.675422577s" podCreationTimestamp="2026-04-17 17:26:45 +0000 UTC" firstStartedPulling="2026-04-17 17:29:02.159987433 +0000 UTC m=+168.735335700" lastFinishedPulling="2026-04-17 17:29:03.784868731 +0000 UTC m=+170.360217008" observedRunningTime="2026-04-17 17:29:04.675258889 +0000 UTC m=+171.250607178" watchObservedRunningTime="2026-04-17 17:29:04.675422577 +0000 UTC m=+171.250770868" Apr 17 17:29:07.602301 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.602261 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fbbc4dd79-ph4hv"] Apr 17 17:29:07.629636 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.629607 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fbbc4dd79-ph4hv"] Apr 17 17:29:07.629801 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.629736 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.634154 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.634123 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j2tm2\"" Apr 17 17:29:07.634313 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.634150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:29:07.634313 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.634169 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:29:07.634313 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.634242 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:29:07.634527 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.634507 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:29:07.634655 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.634555 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:29:07.712666 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.712621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-serving-cert\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.712666 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.712662 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpdx\" (UniqueName: 
\"kubernetes.io/projected/a8db86cb-e997-48e4-86f8-4c182678a152-kube-api-access-twpdx\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.712932 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.712688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-oauth-serving-cert\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.712932 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.712759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-oauth-config\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.712932 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.712837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-service-ca\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.712932 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.712901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-console-config\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.813624 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 17:29:07.813582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-oauth-config\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.813809 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.813642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-service-ca\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.813809 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.813687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-console-config\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.813809 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.813748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-serving-cert\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.813809 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.813774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twpdx\" (UniqueName: \"kubernetes.io/projected/a8db86cb-e997-48e4-86f8-4c182678a152-kube-api-access-twpdx\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " 
pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.813809 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.813807 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-oauth-serving-cert\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.814667 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.814620 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-console-config\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.814895 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.814872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-oauth-serving-cert\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.814958 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.814935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-service-ca\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.816567 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.816543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-oauth-config\") pod \"console-5fbbc4dd79-ph4hv\" (UID: 
\"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.817116 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.817098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-serving-cert\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.822389 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.822362 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpdx\" (UniqueName: \"kubernetes.io/projected/a8db86cb-e997-48e4-86f8-4c182678a152-kube-api-access-twpdx\") pod \"console-5fbbc4dd79-ph4hv\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") " pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:07.941625 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:07.941527 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fbbc4dd79-ph4hv" Apr 17 17:29:11.610368 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.610018 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zwbtc"] Apr 17 17:29:11.614318 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.614290 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.617566 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.617535 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 17:29:11.617965 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.617941 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 17:29:11.618192 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.618172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 17:29:11.618324 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.618179 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 17:29:11.619010 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.618731 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 17:29:11.619010 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.618809 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2tjqb\"" Apr 17 17:29:11.619010 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.618819 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 17:29:11.663158 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.663119 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tf5x4"] Apr 17 17:29:11.665881 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.665857 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.669292 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.669268 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 17:29:11.669292 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.669282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 17:29:11.669994 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.669740 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 17:29:11.670185 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.669854 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-zjjlr\"" Apr 17 17:29:11.687321 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.682624 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tf5x4"] Apr 17 17:29:11.750622 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-tls\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.750622 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750619 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4hzz\" (UniqueName: \"kubernetes.io/projected/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-api-access-w4hzz\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: 
\"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.750868 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750670 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.750868 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.750868 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.750868 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750756 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-textfile\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.750868 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750792 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3865d5e-8fda-423f-8730-6bbaa0c85e25-metrics-client-ca\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.750868 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qmxd\" (UniqueName: \"kubernetes.io/projected/a3865d5e-8fda-423f-8730-6bbaa0c85e25-kube-api-access-9qmxd\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.751196 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-accelerators-collector-config\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.751196 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.751196 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750924 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-root\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.751196 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93f4c3f7-38c9-4335-a216-1b515e0a943a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.751196 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.750983 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-sys\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.751196 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.751005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-wtmp\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.751196 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.751039 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93f4c3f7-38c9-4335-a216-1b515e0a943a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.852405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93f4c3f7-38c9-4335-a216-1b515e0a943a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.852616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852419 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-tls\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.852616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852448 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4hzz\" (UniqueName: \"kubernetes.io/projected/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-api-access-w4hzz\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.852616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.852616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.852616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.852616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-textfile\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.852616 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:29:11.852582 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:29:11.852616 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852609 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3865d5e-8fda-423f-8730-6bbaa0c85e25-metrics-client-ca\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853015 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:29:11.852659 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-tls 
podName:a3865d5e-8fda-423f-8730-6bbaa0c85e25 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:12.352635151 +0000 UTC m=+178.927983432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-tls") pod "node-exporter-zwbtc" (UID: "a3865d5e-8fda-423f-8730-6bbaa0c85e25") : secret "node-exporter-tls" not found Apr 17 17:29:11.853015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qmxd\" (UniqueName: \"kubernetes.io/projected/a3865d5e-8fda-423f-8730-6bbaa0c85e25-kube-api-access-9qmxd\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-accelerators-collector-config\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.853015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" 
(UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-root\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93f4c3f7-38c9-4335-a216-1b515e0a943a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.853015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-sys\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.852932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-wtmp\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853450 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853106 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-wtmp\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853450 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-sys\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853450 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93f4c3f7-38c9-4335-a216-1b515e0a943a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.853600 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853540 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a3865d5e-8fda-423f-8730-6bbaa0c85e25-root\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853922 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a3865d5e-8fda-423f-8730-6bbaa0c85e25-metrics-client-ca\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853922 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-accelerators-collector-config\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.853922 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853750 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.853922 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853804 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-textfile\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.854212 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.853980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/93f4c3f7-38c9-4335-a216-1b515e0a943a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.856175 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.856148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.856293 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.856150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.856360 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.856317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.864817 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.864725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4hzz\" (UniqueName: \"kubernetes.io/projected/93f4c3f7-38c9-4335-a216-1b515e0a943a-kube-api-access-w4hzz\") pod \"kube-state-metrics-69db897b98-tf5x4\" (UID: \"93f4c3f7-38c9-4335-a216-1b515e0a943a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:11.864976 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.864841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qmxd\" (UniqueName: \"kubernetes.io/projected/a3865d5e-8fda-423f-8730-6bbaa0c85e25-kube-api-access-9qmxd\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:11.978829 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:11.978786 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" Apr 17 17:29:12.357952 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:12.357872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-tls\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:12.360647 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:12.360612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a3865d5e-8fda-423f-8730-6bbaa0c85e25-node-exporter-tls\") pod \"node-exporter-zwbtc\" (UID: \"a3865d5e-8fda-423f-8730-6bbaa0c85e25\") " pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:12.527278 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:12.527242 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zwbtc" Apr 17 17:29:13.815546 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:13.815506 2576 patch_prober.go:28] interesting pod/image-registry-68866946cf-4l6t4 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 17:29:13.815985 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:13.815571 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" podUID="860ed815-44b1-4158-af77-fb201acf8cbf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:29:14.252358 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:29:14.252243 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3865d5e_8fda_423f_8730_6bbaa0c85e25.slice/crio-1e56d9b0ef987ec40bb1659f5856c3f9eb1773ce82c171c30b702a8c1b2617d2 WatchSource:0}: Error finding container 1e56d9b0ef987ec40bb1659f5856c3f9eb1773ce82c171c30b702a8c1b2617d2: Status 404 returned error can't find the container with id 1e56d9b0ef987ec40bb1659f5856c3f9eb1773ce82c171c30b702a8c1b2617d2 Apr 17 17:29:14.373429 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.373403 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tf5x4"] Apr 17 17:29:14.375890 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:29:14.375860 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93f4c3f7_38c9_4335_a216_1b515e0a943a.slice/crio-41d7fc15a077b04ffe0a170e59e5b673509121379dffded7da51cde2b9a6e7bb WatchSource:0}: Error finding container 
41d7fc15a077b04ffe0a170e59e5b673509121379dffded7da51cde2b9a6e7bb: Status 404 returned error can't find the container with id 41d7fc15a077b04ffe0a170e59e5b673509121379dffded7da51cde2b9a6e7bb Apr 17 17:29:14.391278 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.391242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fbbc4dd79-ph4hv"] Apr 17 17:29:14.394791 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:29:14.394761 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8db86cb_e997_48e4_86f8_4c182678a152.slice/crio-057c466efc8ac532bfa9fa22cb173c0db528171703b5a9f4aebb794f1fa0cd43 WatchSource:0}: Error finding container 057c466efc8ac532bfa9fa22cb173c0db528171703b5a9f4aebb794f1fa0cd43: Status 404 returned error can't find the container with id 057c466efc8ac532bfa9fa22cb173c0db528171703b5a9f4aebb794f1fa0cd43 Apr 17 17:29:14.661185 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.660744 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-smpcv" Apr 17 17:29:14.699803 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.698569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zwbtc" event={"ID":"a3865d5e-8fda-423f-8730-6bbaa0c85e25","Type":"ContainerStarted","Data":"1e56d9b0ef987ec40bb1659f5856c3f9eb1773ce82c171c30b702a8c1b2617d2"} Apr 17 17:29:14.709900 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.709833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" event={"ID":"93f4c3f7-38c9-4335-a216-1b515e0a943a","Type":"ContainerStarted","Data":"41d7fc15a077b04ffe0a170e59e5b673509121379dffded7da51cde2b9a6e7bb"} Apr 17 17:29:14.712760 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.712718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-6bcc868b7-7dsfs" event={"ID":"57a4d511-24de-4e96-ab0c-2099602b447f","Type":"ContainerStarted","Data":"eea665e3150301d6c6f66afb8af99c4a681e12bdace2ceda8cff1e6b11a4cf6e"} Apr 17 17:29:14.714170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.713867 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-7dsfs" Apr 17 17:29:14.715291 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.714929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fbbc4dd79-ph4hv" event={"ID":"a8db86cb-e997-48e4-86f8-4c182678a152","Type":"ContainerStarted","Data":"057c466efc8ac532bfa9fa22cb173c0db528171703b5a9f4aebb794f1fa0cd43"} Apr 17 17:29:14.725550 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.725486 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-7dsfs" Apr 17 17:29:14.733689 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:14.733631 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-7dsfs" podStartSLOduration=0.940407357 podStartE2EDuration="17.733612455s" podCreationTimestamp="2026-04-17 17:28:57 +0000 UTC" firstStartedPulling="2026-04-17 17:28:57.517630213 +0000 UTC m=+164.092978478" lastFinishedPulling="2026-04-17 17:29:14.310835308 +0000 UTC m=+180.886183576" observedRunningTime="2026-04-17 17:29:14.732598687 +0000 UTC m=+181.307946977" watchObservedRunningTime="2026-04-17 17:29:14.733612455 +0000 UTC m=+181.308960744" Apr 17 17:29:15.626565 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:15.625982 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" Apr 17 17:29:15.721844 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:15.721802 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="a3865d5e-8fda-423f-8730-6bbaa0c85e25" containerID="2745cd99c8023c41b2a90ffc01d4098e5120d223141a7ea97e1b9be43b137065" exitCode=0 Apr 17 17:29:15.722258 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:15.721999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zwbtc" event={"ID":"a3865d5e-8fda-423f-8730-6bbaa0c85e25","Type":"ContainerDied","Data":"2745cd99c8023c41b2a90ffc01d4098e5120d223141a7ea97e1b9be43b137065"} Apr 17 17:29:16.122405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.120701 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-68cbf949c5-kvvxc"] Apr 17 17:29:16.129466 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.128875 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" Apr 17 17:29:16.135106 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.134571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-x7x67\"" Apr 17 17:29:16.135106 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.134640 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 17:29:16.135106 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.134571 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 17:29:16.135106 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.134877 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 17:29:16.135106 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.134888 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 17:29:16.135669 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.135308 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1r2rpf4qhoq6h\"" Apr 17 17:29:16.139365 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.139219 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-68cbf949c5-kvvxc"] Apr 17 17:29:16.200428 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.200388 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-secret-metrics-server-tls\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" Apr 17 17:29:16.200638 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.200458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c9abd61-e722-4556-8959-44e58ae5fa17-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" Apr 17 17:29:16.200638 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.200489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9c9abd61-e722-4556-8959-44e58ae5fa17-metrics-server-audit-profiles\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" Apr 17 17:29:16.200638 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.200532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-client-ca-bundle\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" Apr 17 17:29:16.200638 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.200557 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mlqp\" (UniqueName: \"kubernetes.io/projected/9c9abd61-e722-4556-8959-44e58ae5fa17-kube-api-access-9mlqp\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" Apr 17 17:29:16.200889 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.200642 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9c9abd61-e722-4556-8959-44e58ae5fa17-audit-log\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" Apr 17 17:29:16.200889 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.200694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-secret-metrics-server-client-certs\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" Apr 17 17:29:16.301945 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.301888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9c9abd61-e722-4556-8959-44e58ae5fa17-audit-log\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " 
pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.302140 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.301967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-secret-metrics-server-client-certs\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.302140 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.302044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-secret-metrics-server-tls\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.302140 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.302093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c9abd61-e722-4556-8959-44e58ae5fa17-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.302140 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.302114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9c9abd61-e722-4556-8959-44e58ae5fa17-metrics-server-audit-profiles\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.302140 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.302138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-client-ca-bundle\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.302398 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.302159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mlqp\" (UniqueName: \"kubernetes.io/projected/9c9abd61-e722-4556-8959-44e58ae5fa17-kube-api-access-9mlqp\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.302872 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.302818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/9c9abd61-e722-4556-8959-44e58ae5fa17-audit-log\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.303329 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.303281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c9abd61-e722-4556-8959-44e58ae5fa17-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.303853 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.303807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/9c9abd61-e722-4556-8959-44e58ae5fa17-metrics-server-audit-profiles\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.306219 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.306172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-secret-metrics-server-tls\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.307417 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.306609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-client-ca-bundle\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.308389 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.307538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/9c9abd61-e722-4556-8959-44e58ae5fa17-secret-metrics-server-client-certs\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.321565 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.321506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mlqp\" (UniqueName: \"kubernetes.io/projected/9c9abd61-e722-4556-8959-44e58ae5fa17-kube-api-access-9mlqp\") pod \"metrics-server-68cbf949c5-kvvxc\" (UID: \"9c9abd61-e722-4556-8959-44e58ae5fa17\") " pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.412642 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.412547 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76c6ccf79c-dq9rn"]
Apr 17 17:29:16.441671 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.441624 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c6ccf79c-dq9rn"]
Apr 17 17:29:16.441847 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.441716 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.453756 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.453262 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 17:29:16.460106 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.460070 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:16.480123 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.479918 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68866946cf-4l6t4"]
Apr 17 17:29:16.605981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.605936 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-console-config\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.606169 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.606007 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-oauth-config\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.606169 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.606129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-service-ca\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.606169 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.606165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljjv2\" (UniqueName: \"kubernetes.io/projected/d041b984-995a-4365-be52-c5988078785c-kube-api-access-ljjv2\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.606338 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.606222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-trusted-ca-bundle\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.606338 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.606268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-oauth-serving-cert\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.606437 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.606339 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-serving-cert\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.646613 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.646565 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-68cbf949c5-kvvxc"]
Apr 17 17:29:16.652201 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:29:16.652152 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c9abd61_e722_4556_8959_44e58ae5fa17.slice/crio-774c3e0fc3ba6896697c7b65457c0841071b0e949622cb8abc69b886b56d2c5b WatchSource:0}: Error finding container 774c3e0fc3ba6896697c7b65457c0841071b0e949622cb8abc69b886b56d2c5b: Status 404 returned error can't find the container with id 774c3e0fc3ba6896697c7b65457c0841071b0e949622cb8abc69b886b56d2c5b
Apr 17 17:29:16.707241 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.707146 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-trusted-ca-bundle\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.707241 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.707219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-oauth-serving-cert\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.707500 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.707272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-serving-cert\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.708069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.707696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-console-config\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.708069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.707759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-oauth-config\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.708069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.707825 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-service-ca\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.708069 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.707860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljjv2\" (UniqueName: \"kubernetes.io/projected/d041b984-995a-4365-be52-c5988078785c-kube-api-access-ljjv2\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.708342 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.708243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-trusted-ca-bundle\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.708429 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.708409 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-oauth-serving-cert\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.708487 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.708443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-console-config\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.708991 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.708964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-service-ca\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.710843 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.710821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-serving-cert\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.711070 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.711045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-oauth-config\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.716531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.716506 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljjv2\" (UniqueName: \"kubernetes.io/projected/d041b984-995a-4365-be52-c5988078785c-kube-api-access-ljjv2\") pod \"console-76c6ccf79c-dq9rn\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.726399 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.726358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" event={"ID":"9c9abd61-e722-4556-8959-44e58ae5fa17","Type":"ContainerStarted","Data":"774c3e0fc3ba6896697c7b65457c0841071b0e949622cb8abc69b886b56d2c5b"}
Apr 17 17:29:16.728721 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.728689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zwbtc" event={"ID":"a3865d5e-8fda-423f-8730-6bbaa0c85e25","Type":"ContainerStarted","Data":"28ea2d89a0b6116128473057a8e557b7d40b3b473ba0311ff126d6c2753aa992"}
Apr 17 17:29:16.728841 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.728727 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zwbtc" event={"ID":"a3865d5e-8fda-423f-8730-6bbaa0c85e25","Type":"ContainerStarted","Data":"b03ce2043f3187b433e4882b49caca90e105c681b38e1017fa4292b68e6560bc"}
Apr 17 17:29:16.730923 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.730895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" event={"ID":"93f4c3f7-38c9-4335-a216-1b515e0a943a","Type":"ContainerStarted","Data":"c3f00bbd86efba28e717271243c7b83f0c38c7bd7156044357aa1ff2bbcebc29"}
Apr 17 17:29:16.731070 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.730929 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" event={"ID":"93f4c3f7-38c9-4335-a216-1b515e0a943a","Type":"ContainerStarted","Data":"99a1269e7a0151ad139b180d8594fc0bd3ca7023b24abb4fccbb58d5ca6d2eea"}
Apr 17 17:29:16.731070 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.730944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" event={"ID":"93f4c3f7-38c9-4335-a216-1b515e0a943a","Type":"ContainerStarted","Data":"40d827588b739dcc9aed0589415084224e602b2a96b806aff3ed5ab25dcf59fb"}
Apr 17 17:29:16.746676 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.746612 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zwbtc" podStartSLOduration=4.923646722 podStartE2EDuration="5.746592734s" podCreationTimestamp="2026-04-17 17:29:11 +0000 UTC" firstStartedPulling="2026-04-17 17:29:14.254495719 +0000 UTC m=+180.829843992" lastFinishedPulling="2026-04-17 17:29:15.077441733 +0000 UTC m=+181.652790004" observedRunningTime="2026-04-17 17:29:16.746306548 +0000 UTC m=+183.321654849" watchObservedRunningTime="2026-04-17 17:29:16.746592734 +0000 UTC m=+183.321941022"
Apr 17 17:29:16.756842 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.756802 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:16.773724 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.773660 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-tf5x4" podStartSLOduration=4.230647258 podStartE2EDuration="5.773638216s" podCreationTimestamp="2026-04-17 17:29:11 +0000 UTC" firstStartedPulling="2026-04-17 17:29:14.377998764 +0000 UTC m=+180.953347030" lastFinishedPulling="2026-04-17 17:29:15.92098955 +0000 UTC m=+182.496337988" observedRunningTime="2026-04-17 17:29:16.772958914 +0000 UTC m=+183.348307202" watchObservedRunningTime="2026-04-17 17:29:16.773638216 +0000 UTC m=+183.348986505"
Apr 17 17:29:16.921544 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:16.920406 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c6ccf79c-dq9rn"]
Apr 17 17:29:18.165599 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:29:18.165553 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd041b984_995a_4365_be52_c5988078785c.slice/crio-288ada500c4b03595e535683f346874f272fa0279af531669481a4c363f94948 WatchSource:0}: Error finding container 288ada500c4b03595e535683f346874f272fa0279af531669481a4c363f94948: Status 404 returned error can't find the container with id 288ada500c4b03595e535683f346874f272fa0279af531669481a4c363f94948
Apr 17 17:29:18.741423 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:18.741381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c6ccf79c-dq9rn" event={"ID":"d041b984-995a-4365-be52-c5988078785c","Type":"ContainerStarted","Data":"288ada500c4b03595e535683f346874f272fa0279af531669481a4c363f94948"}
Apr 17 17:29:19.747297 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:19.747259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fbbc4dd79-ph4hv" event={"ID":"a8db86cb-e997-48e4-86f8-4c182678a152","Type":"ContainerStarted","Data":"4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd"}
Apr 17 17:29:19.748908 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:19.748878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c6ccf79c-dq9rn" event={"ID":"d041b984-995a-4365-be52-c5988078785c","Type":"ContainerStarted","Data":"7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705"}
Apr 17 17:29:19.766005 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:19.765948 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fbbc4dd79-ph4hv" podStartSLOduration=8.578163386 podStartE2EDuration="12.76592938s" podCreationTimestamp="2026-04-17 17:29:07 +0000 UTC" firstStartedPulling="2026-04-17 17:29:14.396757142 +0000 UTC m=+180.972105408" lastFinishedPulling="2026-04-17 17:29:18.584523119 +0000 UTC m=+185.159871402" observedRunningTime="2026-04-17 17:29:19.763603601 +0000 UTC m=+186.338951895" watchObservedRunningTime="2026-04-17 17:29:19.76592938 +0000 UTC m=+186.341277667"
Apr 17 17:29:19.780474 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:19.780412 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76c6ccf79c-dq9rn" podStartSLOduration=3.231409075 podStartE2EDuration="3.780393232s" podCreationTimestamp="2026-04-17 17:29:16 +0000 UTC" firstStartedPulling="2026-04-17 17:29:18.16810482 +0000 UTC m=+184.743453100" lastFinishedPulling="2026-04-17 17:29:18.717088976 +0000 UTC m=+185.292437257" observedRunningTime="2026-04-17 17:29:19.779844355 +0000 UTC m=+186.355192669" watchObservedRunningTime="2026-04-17 17:29:19.780393232 +0000 UTC m=+186.355741520"
Apr 17 17:29:20.754145 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:20.754105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" event={"ID":"9c9abd61-e722-4556-8959-44e58ae5fa17","Type":"ContainerStarted","Data":"301310654e59a374dcc68d6bfeea6a9c4d6e1232d6c231190955ce77b1032648"}
Apr 17 17:29:20.773450 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:20.773385 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc" podStartSLOduration=1.201258576 podStartE2EDuration="4.773370226s" podCreationTimestamp="2026-04-17 17:29:16 +0000 UTC" firstStartedPulling="2026-04-17 17:29:16.654592024 +0000 UTC m=+183.229940289" lastFinishedPulling="2026-04-17 17:29:20.22670367 +0000 UTC m=+186.802051939" observedRunningTime="2026-04-17 17:29:20.77120009 +0000 UTC m=+187.346548408" watchObservedRunningTime="2026-04-17 17:29:20.773370226 +0000 UTC m=+187.348718520"
Apr 17 17:29:26.757002 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:26.756953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:26.757503 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:26.757049 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:26.761973 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:26.761948 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:26.775747 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:26.775721 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76c6ccf79c-dq9rn"
Apr 17 17:29:26.825882 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:26.825841 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fbbc4dd79-ph4hv"]
Apr 17 17:29:27.942317 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:27.942282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fbbc4dd79-ph4hv"
Apr 17 17:29:36.460585 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:36.460544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:36.461074 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:36.460638 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:41.512473 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.512413 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" podUID="860ed815-44b1-4158-af77-fb201acf8cbf" containerName="registry" containerID="cri-o://498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3" gracePeriod=30
Apr 17 17:29:41.754648 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.754622 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:29:41.816334 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.816254 2576 generic.go:358] "Generic (PLEG): container finished" podID="860ed815-44b1-4158-af77-fb201acf8cbf" containerID="498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3" exitCode=0
Apr 17 17:29:41.816334 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.816295 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" event={"ID":"860ed815-44b1-4158-af77-fb201acf8cbf","Type":"ContainerDied","Data":"498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3"}
Apr 17 17:29:41.816334 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.816318 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68866946cf-4l6t4"
Apr 17 17:29:41.816334 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.816345 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-68866946cf-4l6t4" event={"ID":"860ed815-44b1-4158-af77-fb201acf8cbf","Type":"ContainerDied","Data":"655e75aeb6f944fdbce081ed9f4a329ccaa9eddb2bd90dad4c02b0302e195038"}
Apr 17 17:29:41.816627 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.816362 2576 scope.go:117] "RemoveContainer" containerID="498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3"
Apr 17 17:29:41.824189 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.824116 2576 scope.go:117] "RemoveContainer" containerID="498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3"
Apr 17 17:29:41.824432 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:29:41.824405 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3\": container with ID starting with 498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3 not found: ID does not exist" containerID="498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3"
Apr 17 17:29:41.824493 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.824442 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3"} err="failed to get container status \"498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3\": rpc error: code = NotFound desc = could not find container \"498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3\": container with ID starting with 498933d402ae44fb7132a1f3f9e4180942ec250d45ce868ded0c051ad6d6dec3 not found: ID does not exist"
Apr 17 17:29:41.840735 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.840699 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-trusted-ca\") pod \"860ed815-44b1-4158-af77-fb201acf8cbf\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") "
Apr 17 17:29:41.840876 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.840761 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-registry-certificates\") pod \"860ed815-44b1-4158-af77-fb201acf8cbf\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") "
Apr 17 17:29:41.840876 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.840844 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd85j\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-kube-api-access-pd85j\") pod \"860ed815-44b1-4158-af77-fb201acf8cbf\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") "
Apr 17 17:29:41.840987 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.840877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") pod \"860ed815-44b1-4158-af77-fb201acf8cbf\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") "
Apr 17 17:29:41.840987 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.840913 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-bound-sa-token\") pod \"860ed815-44b1-4158-af77-fb201acf8cbf\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") "
Apr 17 17:29:41.840987 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.840944 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/860ed815-44b1-4158-af77-fb201acf8cbf-ca-trust-extracted\") pod \"860ed815-44b1-4158-af77-fb201acf8cbf\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") "
Apr 17 17:29:41.841184 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.841067 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-installation-pull-secrets\") pod \"860ed815-44b1-4158-af77-fb201acf8cbf\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") "
Apr 17 17:29:41.841184 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.841105 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-image-registry-private-configuration\") pod \"860ed815-44b1-4158-af77-fb201acf8cbf\" (UID: \"860ed815-44b1-4158-af77-fb201acf8cbf\") "
Apr 17 17:29:41.841388 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.841212 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "860ed815-44b1-4158-af77-fb201acf8cbf" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:29:41.841477 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.841382 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "860ed815-44b1-4158-af77-fb201acf8cbf" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:29:41.843678 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.843647 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "860ed815-44b1-4158-af77-fb201acf8cbf" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:29:41.843678 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.843659 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "860ed815-44b1-4158-af77-fb201acf8cbf" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:29:41.843843 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.843762 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "860ed815-44b1-4158-af77-fb201acf8cbf" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:29:41.843843 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.843771 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "860ed815-44b1-4158-af77-fb201acf8cbf" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:29:41.843920 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.843848 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-kube-api-access-pd85j" (OuterVolumeSpecName: "kube-api-access-pd85j") pod "860ed815-44b1-4158-af77-fb201acf8cbf" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf"). InnerVolumeSpecName "kube-api-access-pd85j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:29:41.849532 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.849506 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860ed815-44b1-4158-af77-fb201acf8cbf-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "860ed815-44b1-4158-af77-fb201acf8cbf" (UID: "860ed815-44b1-4158-af77-fb201acf8cbf"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:29:41.942800 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.942760 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pd85j\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-kube-api-access-pd85j\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:41.942800 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.942791 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-registry-tls\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:41.942800 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.942802 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/860ed815-44b1-4158-af77-fb201acf8cbf-bound-sa-token\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:41.942800 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.942813 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/860ed815-44b1-4158-af77-fb201acf8cbf-ca-trust-extracted\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:41.943098 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.942822 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-installation-pull-secrets\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:41.943098 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.942831 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/860ed815-44b1-4158-af77-fb201acf8cbf-image-registry-private-configuration\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:41.943098 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.942841 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-trusted-ca\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:41.943098 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:41.942851 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/860ed815-44b1-4158-af77-fb201acf8cbf-registry-certificates\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:42.133958 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:42.133871 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68866946cf-4l6t4"]
Apr 17 17:29:42.137096 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:42.137069 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-68866946cf-4l6t4"]
Apr 17 17:29:44.000459 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:44.000420 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="860ed815-44b1-4158-af77-fb201acf8cbf" path="/var/lib/kubelet/pods/860ed815-44b1-4158-af77-fb201acf8cbf/volumes"
Apr 17 17:29:51.845567 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:51.845521 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fbbc4dd79-ph4hv" podUID="a8db86cb-e997-48e4-86f8-4c182678a152" containerName="console" containerID="cri-o://4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd" gracePeriod=15
Apr 17 17:29:52.159970 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.159944 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fbbc4dd79-ph4hv_a8db86cb-e997-48e4-86f8-4c182678a152/console/0.log"
Apr 17 17:29:52.160139 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.160016 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fbbc4dd79-ph4hv"
Apr 17 17:29:52.235124 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235089 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-console-config\") pod \"a8db86cb-e997-48e4-86f8-4c182678a152\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") "
Apr 17 17:29:52.235327 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235154 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-oauth-serving-cert\") pod \"a8db86cb-e997-48e4-86f8-4c182678a152\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") "
Apr 17 17:29:52.235327 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235194 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-service-ca\") pod \"a8db86cb-e997-48e4-86f8-4c182678a152\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") "
Apr 17 17:29:52.235327 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twpdx\" (UniqueName: \"kubernetes.io/projected/a8db86cb-e997-48e4-86f8-4c182678a152-kube-api-access-twpdx\") pod \"a8db86cb-e997-48e4-86f8-4c182678a152\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") "
Apr 17 17:29:52.235327 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235253 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-oauth-config\") pod \"a8db86cb-e997-48e4-86f8-4c182678a152\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") "
Apr 17 17:29:52.235327 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235295 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-serving-cert\") pod \"a8db86cb-e997-48e4-86f8-4c182678a152\" (UID: \"a8db86cb-e997-48e4-86f8-4c182678a152\") "
Apr 17 17:29:52.235655 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-console-config" (OuterVolumeSpecName: "console-config") pod "a8db86cb-e997-48e4-86f8-4c182678a152" (UID: "a8db86cb-e997-48e4-86f8-4c182678a152"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:29:52.235729 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235631 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-service-ca" (OuterVolumeSpecName: "service-ca") pod "a8db86cb-e997-48e4-86f8-4c182678a152" (UID: "a8db86cb-e997-48e4-86f8-4c182678a152"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:29:52.235729 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.235676 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a8db86cb-e997-48e4-86f8-4c182678a152" (UID: "a8db86cb-e997-48e4-86f8-4c182678a152"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:29:52.237637 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.237602 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a8db86cb-e997-48e4-86f8-4c182678a152" (UID: "a8db86cb-e997-48e4-86f8-4c182678a152"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:29:52.237743 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.237648 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a8db86cb-e997-48e4-86f8-4c182678a152" (UID: "a8db86cb-e997-48e4-86f8-4c182678a152"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:29:52.237743 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.237673 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8db86cb-e997-48e4-86f8-4c182678a152-kube-api-access-twpdx" (OuterVolumeSpecName: "kube-api-access-twpdx") pod "a8db86cb-e997-48e4-86f8-4c182678a152" (UID: "a8db86cb-e997-48e4-86f8-4c182678a152"). InnerVolumeSpecName "kube-api-access-twpdx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:29:52.336431 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.336390 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-oauth-config\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:52.336431 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.336424 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8db86cb-e997-48e4-86f8-4c182678a152-console-serving-cert\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:52.336431 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.336435 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-console-config\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:52.336660 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.336444 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-oauth-serving-cert\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:52.336660 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.336454 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8db86cb-e997-48e4-86f8-4c182678a152-service-ca\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:52.336660 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.336463 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twpdx\" (UniqueName: \"kubernetes.io/projected/a8db86cb-e997-48e4-86f8-4c182678a152-kube-api-access-twpdx\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:29:52.849759 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.849726 2576 generic.go:358] "Generic (PLEG): container finished" podID="32d852f3-1d88-49d6-93e3-d36b1a499102" containerID="b4ee937f269bf069443137c2c14deebe25d14a1498a31fbdae67529ab5142119" exitCode=0
Apr 17 17:29:52.850266 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.849799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vz885" event={"ID":"32d852f3-1d88-49d6-93e3-d36b1a499102","Type":"ContainerDied","Data":"b4ee937f269bf069443137c2c14deebe25d14a1498a31fbdae67529ab5142119"}
Apr 17 17:29:52.850266 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.850229 2576 scope.go:117] "RemoveContainer" containerID="b4ee937f269bf069443137c2c14deebe25d14a1498a31fbdae67529ab5142119"
Apr 17 17:29:52.850990 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.850940 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fbbc4dd79-ph4hv_a8db86cb-e997-48e4-86f8-4c182678a152/console/0.log"
Apr 17 17:29:52.850990 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.850972 2576 generic.go:358] "Generic (PLEG): container finished" podID="a8db86cb-e997-48e4-86f8-4c182678a152" containerID="4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd" exitCode=2
Apr 17 17:29:52.851126 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.851049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fbbc4dd79-ph4hv" event={"ID":"a8db86cb-e997-48e4-86f8-4c182678a152","Type":"ContainerDied","Data":"4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd"}
Apr 17 17:29:52.851126 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.851065 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fbbc4dd79-ph4hv"
Apr 17 17:29:52.851126 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.851082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fbbc4dd79-ph4hv" event={"ID":"a8db86cb-e997-48e4-86f8-4c182678a152","Type":"ContainerDied","Data":"057c466efc8ac532bfa9fa22cb173c0db528171703b5a9f4aebb794f1fa0cd43"}
Apr 17 17:29:52.851126 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.851100 2576 scope.go:117] "RemoveContainer" containerID="4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd"
Apr 17 17:29:52.860908 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.860886 2576 scope.go:117] "RemoveContainer" containerID="4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd"
Apr 17 17:29:52.861226 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:29:52.861207 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd\": container with ID starting with 4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd not found: ID does not exist" containerID="4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd"
Apr 17 17:29:52.861295 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.861240 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd"} err="failed to get container status \"4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd\": rpc error: code = NotFound desc = could not find container \"4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd\": container with ID starting with 4bfb52ded9d49d0efbc2865458a1f4748f035944e7987880e4bccf0b3968b3fd not found: ID does not exist"
Apr 17 17:29:52.882132 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.882096 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fbbc4dd79-ph4hv"]
Apr 17 17:29:52.897003 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:52.894278 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fbbc4dd79-ph4hv"]
Apr 17 17:29:53.855293 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:53.855258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-vz885" event={"ID":"32d852f3-1d88-49d6-93e3-d36b1a499102","Type":"ContainerStarted","Data":"938b28cbef97ba1273ef513e76a0a6c613be48a1d3734ab2220ba2bd008416e7"}
Apr 17 17:29:54.001152 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:54.001120 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8db86cb-e997-48e4-86f8-4c182678a152" path="/var/lib/kubelet/pods/a8db86cb-e997-48e4-86f8-4c182678a152/volumes"
Apr 17 17:29:56.466258 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:56.466223 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:56.470431 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:56.470404 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-68cbf949c5-kvvxc"
Apr 17 17:29:57.870327 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:57.870291 2576 generic.go:358] "Generic (PLEG): container finished" podID="4ef3c872-c400-4d98-9028-72e95653a455" containerID="1087e54b941c08ad943da4c7c89854a0ca67b37de49cc1838f042e9d5bedaf41" exitCode=0
Apr 17 17:29:57.870764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:57.870337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9" event={"ID":"4ef3c872-c400-4d98-9028-72e95653a455","Type":"ContainerDied","Data":"1087e54b941c08ad943da4c7c89854a0ca67b37de49cc1838f042e9d5bedaf41"}
Apr 17 17:29:57.870764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:57.870596 2576 scope.go:117] "RemoveContainer" containerID="1087e54b941c08ad943da4c7c89854a0ca67b37de49cc1838f042e9d5bedaf41"
Apr 17 17:29:58.875878 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:29:58.875841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-hp4p9" event={"ID":"4ef3c872-c400-4d98-9028-72e95653a455","Type":"ContainerStarted","Data":"2331b015dc41bba5a729e23a06523904dcd288d4c8b8586864578a7be8dde3e4"}
Apr 17 17:30:24.797046 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:24.796989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:30:24.799369 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:24.799337 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c56ede72-4e1e-4a75-9ebe-eabfdfcd2065-metrics-certs\") pod \"network-metrics-daemon-l28wh\" (UID: \"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065\") " pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:30:25.100761 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:25.100672 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q7kb6\""
Apr 17 17:30:25.108034 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:25.108002 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l28wh"
Apr 17 17:30:25.242302 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:25.242274 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l28wh"]
Apr 17 17:30:25.244963 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:30:25.244930 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc56ede72_4e1e_4a75_9ebe_eabfdfcd2065.slice/crio-e8145e0be3f737a6d481ddd92384616f57589f289700c617e9fdbf012c58218b WatchSource:0}: Error finding container e8145e0be3f737a6d481ddd92384616f57589f289700c617e9fdbf012c58218b: Status 404 returned error can't find the container with id e8145e0be3f737a6d481ddd92384616f57589f289700c617e9fdbf012c58218b
Apr 17 17:30:25.960905 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:25.960860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l28wh" event={"ID":"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065","Type":"ContainerStarted","Data":"e8145e0be3f737a6d481ddd92384616f57589f289700c617e9fdbf012c58218b"}
Apr 17 17:30:26.965330 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:26.965294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l28wh" event={"ID":"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065","Type":"ContainerStarted","Data":"3355f5cfbfb244357b8960971c1fbf17fffc45cf0fd73ca50fc712a821d9772b"}
Apr 17 17:30:26.965330 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:26.965333 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l28wh" event={"ID":"c56ede72-4e1e-4a75-9ebe-eabfdfcd2065","Type":"ContainerStarted","Data":"75b5f4710076f800e9b57f9e38689e8bcc9adfbbcbb1e83d3e76d24380eae82d"}
Apr 17 17:30:26.983015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:26.982942 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l28wh" podStartSLOduration=251.828695856 podStartE2EDuration="4m12.982921934s" podCreationTimestamp="2026-04-17 17:26:14 +0000 UTC" firstStartedPulling="2026-04-17 17:30:25.24690368 +0000 UTC m=+251.822251945" lastFinishedPulling="2026-04-17 17:30:26.401129754 +0000 UTC m=+252.976478023" observedRunningTime="2026-04-17 17:30:26.982669501 +0000 UTC m=+253.558017795" watchObservedRunningTime="2026-04-17 17:30:26.982921934 +0000 UTC m=+253.558270224"
Apr 17 17:30:39.390874 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.390791 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76c7bcb89d-djbx8"]
Apr 17 17:30:39.391443 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.391176 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8db86cb-e997-48e4-86f8-4c182678a152" containerName="console"
Apr 17 17:30:39.391443 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.391196 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8db86cb-e997-48e4-86f8-4c182678a152" containerName="console"
Apr 17 17:30:39.391443 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.391229 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="860ed815-44b1-4158-af77-fb201acf8cbf" containerName="registry"
Apr 17 17:30:39.391443 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.391238 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="860ed815-44b1-4158-af77-fb201acf8cbf" containerName="registry"
Apr 17 17:30:39.391443 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.391318 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="860ed815-44b1-4158-af77-fb201acf8cbf" containerName="registry"
Apr 17 17:30:39.391443 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.391332 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8db86cb-e997-48e4-86f8-4c182678a152" containerName="console"
Apr 17 17:30:39.395823 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.395798 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.407645 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.407619 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c7bcb89d-djbx8"]
Apr 17 17:30:39.519360 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.519312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-oauth-serving-cert\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.519360 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.519360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-trusted-ca-bundle\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.519569 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.519396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-serving-cert\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.519569 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.519432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-oauth-config\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.519569 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.519503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-config\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.519569 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.519544 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5db\" (UniqueName: \"kubernetes.io/projected/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-kube-api-access-rv5db\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.519697 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.519571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-service-ca\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.619969 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.619927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-trusted-ca-bundle\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.619969 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.619970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-serving-cert\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.620273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.620014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-oauth-config\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.620273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.620067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-config\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.620273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.620089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5db\" (UniqueName: \"kubernetes.io/projected/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-kube-api-access-rv5db\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.620273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.620119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-service-ca\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.620273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.620148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-oauth-serving-cert\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.620840 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.620812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-config\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.620960 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.620888 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-oauth-serving-cert\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.621015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.620970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-service-ca\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.621171 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.621152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-trusted-ca-bundle\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.623180 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.623158 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-serving-cert\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.623276 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.623262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-oauth-config\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.628034 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.628005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5db\" (UniqueName: \"kubernetes.io/projected/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-kube-api-access-rv5db\") pod \"console-76c7bcb89d-djbx8\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.705786 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.705700 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:39.831573 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:39.831461 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c7bcb89d-djbx8"]
Apr 17 17:30:39.834407 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:30:39.834370 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebc8aa9c_97cc_49cb_a0bf_ad3829e46f2f.slice/crio-d4815c708659590813887172c0160194efae9c50a6051f72dd0e8a7b779fe4e4 WatchSource:0}: Error finding container d4815c708659590813887172c0160194efae9c50a6051f72dd0e8a7b779fe4e4: Status 404 returned error can't find the container with id d4815c708659590813887172c0160194efae9c50a6051f72dd0e8a7b779fe4e4
Apr 17 17:30:40.005783 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:40.005744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c7bcb89d-djbx8" event={"ID":"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f","Type":"ContainerStarted","Data":"4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42"}
Apr 17 17:30:40.005783 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:40.005787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c7bcb89d-djbx8" event={"ID":"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f","Type":"ContainerStarted","Data":"d4815c708659590813887172c0160194efae9c50a6051f72dd0e8a7b779fe4e4"}
Apr 17 17:30:40.025334 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:40.025280 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76c7bcb89d-djbx8" podStartSLOduration=1.025262825 podStartE2EDuration="1.025262825s" podCreationTimestamp="2026-04-17 17:30:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:30:40.024255581 +0000 UTC m=+266.599603868" watchObservedRunningTime="2026-04-17 17:30:40.025262825 +0000 UTC m=+266.600611116"
Apr 17 17:30:49.706850 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:49.706813 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:49.706850 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:49.706858 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:49.711999 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:49.711973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:50.037986 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:50.037953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76c7bcb89d-djbx8"
Apr 17 17:30:50.082377 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:30:50.082339 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c6ccf79c-dq9rn"]
Apr 17 17:31:13.899591 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:13.899559 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:31:13.900302 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:13.899600 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:31:13.907356 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:13.907330 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:31:13.907540 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:13.907409 2576 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 17:31:13.910513 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:13.910491 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:31:15.108166 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.108128 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76c6ccf79c-dq9rn" podUID="d041b984-995a-4365-be52-c5988078785c" containerName="console" containerID="cri-o://7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705" gracePeriod=15 Apr 17 17:31:15.342834 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.342809 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c6ccf79c-dq9rn_d041b984-995a-4365-be52-c5988078785c/console/0.log" Apr 17 17:31:15.342980 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.342873 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c6ccf79c-dq9rn" Apr 17 17:31:15.504881 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.504848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-service-ca\") pod \"d041b984-995a-4365-be52-c5988078785c\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " Apr 17 17:31:15.505125 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.504965 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-console-config\") pod \"d041b984-995a-4365-be52-c5988078785c\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " Apr 17 17:31:15.505125 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.504996 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-oauth-serving-cert\") pod \"d041b984-995a-4365-be52-c5988078785c\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " Apr 17 17:31:15.505125 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505047 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-serving-cert\") pod \"d041b984-995a-4365-be52-c5988078785c\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " Apr 17 17:31:15.505125 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505078 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljjv2\" (UniqueName: \"kubernetes.io/projected/d041b984-995a-4365-be52-c5988078785c-kube-api-access-ljjv2\") pod \"d041b984-995a-4365-be52-c5988078785c\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " Apr 17 17:31:15.505125 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505120 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-oauth-config\") pod \"d041b984-995a-4365-be52-c5988078785c\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " Apr 17 17:31:15.505385 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505169 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-trusted-ca-bundle\") pod \"d041b984-995a-4365-be52-c5988078785c\" (UID: \"d041b984-995a-4365-be52-c5988078785c\") " Apr 17 17:31:15.505385 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505236 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-service-ca" (OuterVolumeSpecName: "service-ca") pod "d041b984-995a-4365-be52-c5988078785c" (UID: "d041b984-995a-4365-be52-c5988078785c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:15.505486 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505427 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-service-ca\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:31:15.505486 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505450 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-console-config" (OuterVolumeSpecName: "console-config") pod "d041b984-995a-4365-be52-c5988078785c" (UID: "d041b984-995a-4365-be52-c5988078785c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:15.505585 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505458 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d041b984-995a-4365-be52-c5988078785c" (UID: "d041b984-995a-4365-be52-c5988078785c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:15.505686 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.505666 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d041b984-995a-4365-be52-c5988078785c" (UID: "d041b984-995a-4365-be52-c5988078785c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:31:15.507267 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.507241 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d041b984-995a-4365-be52-c5988078785c" (UID: "d041b984-995a-4365-be52-c5988078785c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:31:15.507379 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.507273 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d041b984-995a-4365-be52-c5988078785c" (UID: "d041b984-995a-4365-be52-c5988078785c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:31:15.507379 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.507346 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d041b984-995a-4365-be52-c5988078785c-kube-api-access-ljjv2" (OuterVolumeSpecName: "kube-api-access-ljjv2") pod "d041b984-995a-4365-be52-c5988078785c" (UID: "d041b984-995a-4365-be52-c5988078785c"). InnerVolumeSpecName "kube-api-access-ljjv2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:15.605862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.605815 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-console-config\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:31:15.605862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.605853 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-oauth-serving-cert\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:31:15.605862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.605864 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-serving-cert\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:31:15.605862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.605874 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ljjv2\" (UniqueName: \"kubernetes.io/projected/d041b984-995a-4365-be52-c5988078785c-kube-api-access-ljjv2\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:31:15.606177 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.605884 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d041b984-995a-4365-be52-c5988078785c-console-oauth-config\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:31:15.606177 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:15.605893 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d041b984-995a-4365-be52-c5988078785c-trusted-ca-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:31:16.106570 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.106541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c6ccf79c-dq9rn_d041b984-995a-4365-be52-c5988078785c/console/0.log" Apr 17 17:31:16.106736 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.106585 2576 generic.go:358] "Generic (PLEG): container finished" podID="d041b984-995a-4365-be52-c5988078785c" containerID="7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705" exitCode=2 Apr 17 17:31:16.106736 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.106664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c6ccf79c-dq9rn" event={"ID":"d041b984-995a-4365-be52-c5988078785c","Type":"ContainerDied","Data":"7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705"} Apr 17 17:31:16.106736 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.106672 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c6ccf79c-dq9rn" Apr 17 17:31:16.106736 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.106698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c6ccf79c-dq9rn" event={"ID":"d041b984-995a-4365-be52-c5988078785c","Type":"ContainerDied","Data":"288ada500c4b03595e535683f346874f272fa0279af531669481a4c363f94948"} Apr 17 17:31:16.106736 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.106716 2576 scope.go:117] "RemoveContainer" containerID="7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705" Apr 17 17:31:16.119618 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.119459 2576 scope.go:117] "RemoveContainer" containerID="7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705" Apr 17 17:31:16.119846 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:31:16.119741 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705\": container with ID starting with 7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705 not found: ID does not exist" containerID="7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705" Apr 17 17:31:16.119846 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.119766 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705"} err="failed to get container status \"7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705\": rpc error: code = NotFound desc = could not find container \"7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705\": container with ID starting with 7239ff91d7561ae4231f4e2c0419bdfbcfc6a292df150ae91b969417bdefe705 not found: ID does not exist" Apr 17 17:31:16.124521 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.124498 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c6ccf79c-dq9rn"] Apr 17 17:31:16.130104 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:16.130065 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76c6ccf79c-dq9rn"] Apr 17 17:31:18.000769 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:18.000729 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d041b984-995a-4365-be52-c5988078785c" path="/var/lib/kubelet/pods/d041b984-995a-4365-be52-c5988078785c/volumes" Apr 17 17:31:57.290694 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.290652 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67999b787-qb6tf"] Apr 17 17:31:57.291266 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.291095 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d041b984-995a-4365-be52-c5988078785c" containerName="console" Apr 17 17:31:57.291266 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.291115 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d041b984-995a-4365-be52-c5988078785c" containerName="console" Apr 17 17:31:57.291266 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.291186 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d041b984-995a-4365-be52-c5988078785c" containerName="console" Apr 17 17:31:57.294113 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.294089 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.307064 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.307033 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67999b787-qb6tf"] Apr 17 17:31:57.320290 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.320257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-service-ca\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.320290 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.320293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlrwn\" (UniqueName: \"kubernetes.io/projected/1de51bba-e145-42a6-842c-a3db6e008dd4-kube-api-access-jlrwn\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.320499 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.320320 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-trusted-ca-bundle\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.320499 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.320372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-oauth-serving-cert\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 
17:31:57.320499 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.320431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-oauth-config\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.320499 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.320472 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-console-config\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.320499 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.320496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-serving-cert\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.421782 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.421742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-service-ca\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.421782 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.421785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jlrwn\" (UniqueName: \"kubernetes.io/projected/1de51bba-e145-42a6-842c-a3db6e008dd4-kube-api-access-jlrwn\") pod 
\"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422094 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.421803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-trusted-ca-bundle\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422094 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.421822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-oauth-serving-cert\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422094 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.421872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-oauth-config\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422094 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.421894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-console-config\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422094 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.421925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-serving-cert\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422661 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.422628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-service-ca\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.422682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-oauth-serving-cert\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422833 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.422810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-console-config\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.422948 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.422932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-trusted-ca-bundle\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.424406 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.424386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-serving-cert\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.424494 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.424440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-oauth-config\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.431349 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.431325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlrwn\" (UniqueName: \"kubernetes.io/projected/1de51bba-e145-42a6-842c-a3db6e008dd4-kube-api-access-jlrwn\") pod \"console-67999b787-qb6tf\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.608333 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.608238 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:31:57.731907 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.731815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67999b787-qb6tf"] Apr 17 17:31:57.734685 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:31:57.734655 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1de51bba_e145_42a6_842c_a3db6e008dd4.slice/crio-be7e5264d7039ebc21a61ee75a50f82adc1fb1481697674ad2a4d0df3d892b81 WatchSource:0}: Error finding container be7e5264d7039ebc21a61ee75a50f82adc1fb1481697674ad2a4d0df3d892b81: Status 404 returned error can't find the container with id be7e5264d7039ebc21a61ee75a50f82adc1fb1481697674ad2a4d0df3d892b81 Apr 17 17:31:57.736551 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:57.736535 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:31:58.231750 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:58.231711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67999b787-qb6tf" event={"ID":"1de51bba-e145-42a6-842c-a3db6e008dd4","Type":"ContainerStarted","Data":"fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792"} Apr 17 17:31:58.231750 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:58.231744 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67999b787-qb6tf" event={"ID":"1de51bba-e145-42a6-842c-a3db6e008dd4","Type":"ContainerStarted","Data":"be7e5264d7039ebc21a61ee75a50f82adc1fb1481697674ad2a4d0df3d892b81"} Apr 17 17:31:58.252735 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:31:58.252683 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67999b787-qb6tf" podStartSLOduration=1.252666632 podStartE2EDuration="1.252666632s" podCreationTimestamp="2026-04-17 17:31:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:31:58.250288754 +0000 UTC m=+344.825637056" watchObservedRunningTime="2026-04-17 17:31:58.252666632 +0000 UTC m=+344.828014920" Apr 17 17:32:07.609090 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:07.608973 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:32:07.609090 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:07.609060 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:32:07.614347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:07.614319 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:32:08.265148 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:08.265116 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:32:08.319389 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:08.319359 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c7bcb89d-djbx8"] Apr 17 17:32:24.709601 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.709564 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs"] Apr 17 17:32:24.714299 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.714279 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.717060 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.717031 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:32:24.718200 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.718180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5w67t\"" Apr 17 17:32:24.718270 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.718183 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:32:24.722373 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.721760 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs"] Apr 17 17:32:24.852010 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.851963 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbkgq\" (UniqueName: \"kubernetes.io/projected/7495f516-b22d-4a53-9904-6d56254fcb43-kube-api-access-rbkgq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.852224 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.852096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.852224 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.852135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.953282 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.953232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.953457 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.953330 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbkgq\" (UniqueName: \"kubernetes.io/projected/7495f516-b22d-4a53-9904-6d56254fcb43-kube-api-access-rbkgq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.953457 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.953407 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.953728 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.953705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.953764 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.953718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:24.962869 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:24.962808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbkgq\" (UniqueName: \"kubernetes.io/projected/7495f516-b22d-4a53-9904-6d56254fcb43-kube-api-access-rbkgq\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:25.026118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:25.026082 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:32:25.151458 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:25.151384 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs"] Apr 17 17:32:25.153798 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:32:25.153760 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7495f516_b22d_4a53_9904_6d56254fcb43.slice/crio-b6daeba58ead5983290b3f22ff0e88596d70e51c8c9b2e69abdc8c5c0c740fd4 WatchSource:0}: Error finding container b6daeba58ead5983290b3f22ff0e88596d70e51c8c9b2e69abdc8c5c0c740fd4: Status 404 returned error can't find the container with id b6daeba58ead5983290b3f22ff0e88596d70e51c8c9b2e69abdc8c5c0c740fd4 Apr 17 17:32:25.308949 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:25.308911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" event={"ID":"7495f516-b22d-4a53-9904-6d56254fcb43","Type":"ContainerStarted","Data":"b6daeba58ead5983290b3f22ff0e88596d70e51c8c9b2e69abdc8c5c0c740fd4"} Apr 17 17:32:30.325896 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:30.325861 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" event={"ID":"7495f516-b22d-4a53-9904-6d56254fcb43","Type":"ContainerStarted","Data":"d34f86bfd281e62a28c244330b1a9e08ca965ab861788e0af705f60ca69a0ff3"} Apr 17 17:32:31.329590 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:31.329556 2576 generic.go:358] "Generic (PLEG): container finished" podID="7495f516-b22d-4a53-9904-6d56254fcb43" containerID="d34f86bfd281e62a28c244330b1a9e08ca965ab861788e0af705f60ca69a0ff3" exitCode=0 Apr 17 17:32:31.329943 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:32:31.329632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" event={"ID":"7495f516-b22d-4a53-9904-6d56254fcb43","Type":"ContainerDied","Data":"d34f86bfd281e62a28c244330b1a9e08ca965ab861788e0af705f60ca69a0ff3"} Apr 17 17:32:33.340127 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.340065 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76c7bcb89d-djbx8" podUID="ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" containerName="console" containerID="cri-o://4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42" gracePeriod=15 Apr 17 17:32:33.580144 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.580120 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c7bcb89d-djbx8_ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f/console/0.log" Apr 17 17:32:33.580284 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.580190 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c7bcb89d-djbx8" Apr 17 17:32:33.729060 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.728946 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-trusted-ca-bundle\") pod \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " Apr 17 17:32:33.729221 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729073 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-service-ca\") pod \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " Apr 17 17:32:33.729221 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729125 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-serving-cert\") pod \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " Apr 17 17:32:33.729221 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729152 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-oauth-serving-cert\") pod \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " Apr 17 17:32:33.729372 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729285 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-oauth-config\") pod \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " Apr 17 17:32:33.729372 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729346 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-config\") pod \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " Apr 17 17:32:33.729473 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729391 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv5db\" (UniqueName: \"kubernetes.io/projected/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-kube-api-access-rv5db\") pod \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\" (UID: \"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f\") " Apr 17 17:32:33.729520 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729462 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" (UID: "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:32:33.729568 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" (UID: "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:32:33.729568 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729543 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-service-ca" (OuterVolumeSpecName: "service-ca") pod "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" (UID: "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:32:33.729760 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729727 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-config" (OuterVolumeSpecName: "console-config") pod "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" (UID: "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:32:33.729879 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729765 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-trusted-ca-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:32:33.729879 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729783 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-service-ca\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:32:33.729879 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.729796 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-oauth-serving-cert\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:32:33.731582 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.731560 2576 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" (UID: "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:32:33.731582 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.731563 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-kube-api-access-rv5db" (OuterVolumeSpecName: "kube-api-access-rv5db") pod "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" (UID: "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f"). InnerVolumeSpecName "kube-api-access-rv5db". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:32:33.731582 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.731581 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" (UID: "ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:32:33.830807 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.830755 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-serving-cert\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:32:33.830807 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.830797 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-oauth-config\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:32:33.830807 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.830808 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-console-config\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:32:33.830807 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:33.830817 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rv5db\" (UniqueName: \"kubernetes.io/projected/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f-kube-api-access-rv5db\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:32:34.339338 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.339249 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c7bcb89d-djbx8_ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f/console/0.log" Apr 17 17:32:34.339338 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.339296 2576 generic.go:358] "Generic (PLEG): container finished" podID="ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" containerID="4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42" exitCode=2 Apr 17 17:32:34.339523 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.339395 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c7bcb89d-djbx8" Apr 17 17:32:34.339523 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.339391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c7bcb89d-djbx8" event={"ID":"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f","Type":"ContainerDied","Data":"4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42"} Apr 17 17:32:34.339523 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.339503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c7bcb89d-djbx8" event={"ID":"ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f","Type":"ContainerDied","Data":"d4815c708659590813887172c0160194efae9c50a6051f72dd0e8a7b779fe4e4"} Apr 17 17:32:34.339629 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.339523 2576 scope.go:117] "RemoveContainer" containerID="4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42" Apr 17 17:32:34.347435 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.347268 2576 scope.go:117] "RemoveContainer" containerID="4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42" Apr 17 17:32:34.347644 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:32:34.347527 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42\": container with ID starting with 4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42 not found: ID does not exist" containerID="4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42" Apr 17 17:32:34.347644 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.347555 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42"} err="failed to get container status \"4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42\": rpc error: code = 
NotFound desc = could not find container \"4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42\": container with ID starting with 4e3b37fd2884f1b41c951d09e9a716c0d6e12a5be42f84fd5bec705c418def42 not found: ID does not exist" Apr 17 17:32:34.358363 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.358332 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c7bcb89d-djbx8"] Apr 17 17:32:34.360961 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:34.360929 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76c7bcb89d-djbx8"] Apr 17 17:32:36.001519 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:32:36.001486 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" path="/var/lib/kubelet/pods/ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f/volumes" Apr 17 17:32:41.718007 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:32:41.717955 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908: reading manifest sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908 in registry.redhat.io/cert-manager/cert-manager-operator-bundle: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" image="registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908" Apr 17 17:32:41.718479 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:32:41.718219 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="init container 
&Container{Name:pull,Image:registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbkgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000430000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs_openshift-marketplace(7495f516-b22d-4a53-9904-6d56254fcb43): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908: reading manifest sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908 in 
registry.redhat.io/cert-manager/cert-manager-operator-bundle: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:32:41.719415 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:32:41.719383 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908: reading manifest sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908 in registry.redhat.io/cert-manager/cert-manager-operator-bundle: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" Apr 17 17:32:42.364357 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:32:42.364323 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908: reading manifest sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908 in registry.redhat.io/cert-manager/cert-manager-operator-bundle: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" 
podUID="7495f516-b22d-4a53-9904-6d56254fcb43" Apr 17 17:33:04.318885 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:33:04.318832 2576 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908: reading manifest sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908 in registry.redhat.io/cert-manager/cert-manager-operator-bundle: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" image="registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908" Apr 17 17:33:04.319352 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:33:04.318997 2576 kuberuntime_manager.go:1358] "Unhandled Error" err="init container &Container{Name:pull,Image:registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908,Command:[/util/cpb /bundle],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:bundle,ReadOnly:false,MountPath:/bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:util,ReadOnly:false,MountPath:/util,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rbkgq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000430000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs_openshift-marketplace(7495f516-b22d-4a53-9904-6d56254fcb43): ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908: reading manifest sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908 in registry.redhat.io/cert-manager/cert-manager-operator-bundle: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" logger="UnhandledError" Apr 17 17:33:04.320198 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:33:04.320168 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with 
ErrImagePull: \"unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908: reading manifest sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908 in registry.redhat.io/cert-manager/cert-manager-operator-bundle: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" Apr 17 17:33:17.998173 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:33:17.998140 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"pull\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: initializing source docker://registry.redhat.io/cert-manager/cert-manager-operator-bundle@sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908: reading manifest sha256:e4e3f81062da90a9cfcdce27085f0624952374a9aec5fbdd5796a09d24f83908 in registry.redhat.io/cert-manager/cert-manager-operator-bundle: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" Apr 17 17:33:34.510877 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:34.510842 2576 generic.go:358] "Generic (PLEG): container finished" podID="7495f516-b22d-4a53-9904-6d56254fcb43" containerID="2872f1d5b0f4a0c4cdae7344405237356c5d4821665877e4bf6144e2ca6d3ee6" exitCode=0 Apr 17 17:33:34.511344 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:34.510912 
2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" event={"ID":"7495f516-b22d-4a53-9904-6d56254fcb43","Type":"ContainerDied","Data":"2872f1d5b0f4a0c4cdae7344405237356c5d4821665877e4bf6144e2ca6d3ee6"} Apr 17 17:33:41.539467 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:41.539432 2576 generic.go:358] "Generic (PLEG): container finished" podID="7495f516-b22d-4a53-9904-6d56254fcb43" containerID="5871aaf58789a90ec8fd1935f91c3de9fd856e445578a9f00f1a65169a7bbd33" exitCode=0 Apr 17 17:33:41.539860 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:41.539522 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" event={"ID":"7495f516-b22d-4a53-9904-6d56254fcb43","Type":"ContainerDied","Data":"5871aaf58789a90ec8fd1935f91c3de9fd856e445578a9f00f1a65169a7bbd33"} Apr 17 17:33:42.663970 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.663941 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:33:42.681400 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.681371 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-bundle\") pod \"7495f516-b22d-4a53-9904-6d56254fcb43\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " Apr 17 17:33:42.681536 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.681415 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbkgq\" (UniqueName: \"kubernetes.io/projected/7495f516-b22d-4a53-9904-6d56254fcb43-kube-api-access-rbkgq\") pod \"7495f516-b22d-4a53-9904-6d56254fcb43\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " Apr 17 17:33:42.681536 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.681433 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-util\") pod \"7495f516-b22d-4a53-9904-6d56254fcb43\" (UID: \"7495f516-b22d-4a53-9904-6d56254fcb43\") " Apr 17 17:33:42.682135 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.682106 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-bundle" (OuterVolumeSpecName: "bundle") pod "7495f516-b22d-4a53-9904-6d56254fcb43" (UID: "7495f516-b22d-4a53-9904-6d56254fcb43"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:33:42.683604 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.683573 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7495f516-b22d-4a53-9904-6d56254fcb43-kube-api-access-rbkgq" (OuterVolumeSpecName: "kube-api-access-rbkgq") pod "7495f516-b22d-4a53-9904-6d56254fcb43" (UID: "7495f516-b22d-4a53-9904-6d56254fcb43"). InnerVolumeSpecName "kube-api-access-rbkgq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:33:42.687044 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.686988 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-util" (OuterVolumeSpecName: "util") pod "7495f516-b22d-4a53-9904-6d56254fcb43" (UID: "7495f516-b22d-4a53-9904-6d56254fcb43"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:33:42.782686 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.782652 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:33:42.782686 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.782683 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbkgq\" (UniqueName: \"kubernetes.io/projected/7495f516-b22d-4a53-9904-6d56254fcb43-kube-api-access-rbkgq\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:33:42.782686 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:42.782696 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7495f516-b22d-4a53-9904-6d56254fcb43-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:33:43.547042 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:43.546985 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" event={"ID":"7495f516-b22d-4a53-9904-6d56254fcb43","Type":"ContainerDied","Data":"b6daeba58ead5983290b3f22ff0e88596d70e51c8c9b2e69abdc8c5c0c740fd4"} Apr 17 17:33:43.547042 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:43.547013 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnrxs" Apr 17 17:33:43.547251 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:43.547018 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6daeba58ead5983290b3f22ff0e88596d70e51c8c9b2e69abdc8c5c0c740fd4" Apr 17 17:33:47.661891 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.661853 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb"] Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662149 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" containerName="extract" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662162 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" containerName="extract" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662182 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" containerName="pull" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662187 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" containerName="pull" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662194 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" containerName="console" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662200 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" containerName="console" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662208 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" containerName="util" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662213 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" containerName="util" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662257 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7495f516-b22d-4a53-9904-6d56254fcb43" containerName="extract" Apr 17 17:33:47.662326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.662265 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebc8aa9c-97cc-49cb-a0bf-ad3829e46f2f" containerName="console" Apr 17 17:33:47.664801 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.664783 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" Apr 17 17:33:47.668639 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.668608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-dxwzv\"" Apr 17 17:33:47.668639 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.668626 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:33:47.669117 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.669100 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 17 17:33:47.680993 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.680965 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb"] Apr 17 17:33:47.721379 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.721340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76c37344-22fe-440c-8435-d5bff7e9bcbd-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s4rpb\" (UID: \"76c37344-22fe-440c-8435-d5bff7e9bcbd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" Apr 17 17:33:47.721550 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.721401 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxz75\" (UniqueName: \"kubernetes.io/projected/76c37344-22fe-440c-8435-d5bff7e9bcbd-kube-api-access-nxz75\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s4rpb\" (UID: \"76c37344-22fe-440c-8435-d5bff7e9bcbd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" 
Apr 17 17:33:47.822254 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.822219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76c37344-22fe-440c-8435-d5bff7e9bcbd-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s4rpb\" (UID: \"76c37344-22fe-440c-8435-d5bff7e9bcbd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" Apr 17 17:33:47.822441 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.822282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxz75\" (UniqueName: \"kubernetes.io/projected/76c37344-22fe-440c-8435-d5bff7e9bcbd-kube-api-access-nxz75\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s4rpb\" (UID: \"76c37344-22fe-440c-8435-d5bff7e9bcbd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" Apr 17 17:33:47.822617 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.822597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/76c37344-22fe-440c-8435-d5bff7e9bcbd-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s4rpb\" (UID: \"76c37344-22fe-440c-8435-d5bff7e9bcbd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" Apr 17 17:33:47.834480 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.834450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxz75\" (UniqueName: \"kubernetes.io/projected/76c37344-22fe-440c-8435-d5bff7e9bcbd-kube-api-access-nxz75\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-s4rpb\" (UID: \"76c37344-22fe-440c-8435-d5bff7e9bcbd\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" Apr 17 17:33:47.974453 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:47.974350 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" Apr 17 17:33:48.108492 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:48.108465 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb"] Apr 17 17:33:48.111487 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:33:48.111457 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76c37344_22fe_440c_8435_d5bff7e9bcbd.slice/crio-f485757fd0ee8eee7e86ae958f8ce2bdba46ca087b561261486b2041c3c259e2 WatchSource:0}: Error finding container f485757fd0ee8eee7e86ae958f8ce2bdba46ca087b561261486b2041c3c259e2: Status 404 returned error can't find the container with id f485757fd0ee8eee7e86ae958f8ce2bdba46ca087b561261486b2041c3c259e2 Apr 17 17:33:48.569282 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:48.569239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" event={"ID":"76c37344-22fe-440c-8435-d5bff7e9bcbd","Type":"ContainerStarted","Data":"f485757fd0ee8eee7e86ae958f8ce2bdba46ca087b561261486b2041c3c259e2"} Apr 17 17:33:50.580744 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:50.580707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" event={"ID":"76c37344-22fe-440c-8435-d5bff7e9bcbd","Type":"ContainerStarted","Data":"2d14685b8bb8c05bc12edc92b02ab317118f4d770e6c7684a3ccbd4dcb70d8dd"} Apr 17 17:33:50.604886 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:50.604827 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-s4rpb" podStartSLOduration=1.333605588 podStartE2EDuration="3.604812192s" podCreationTimestamp="2026-04-17 
17:33:47 +0000 UTC" firstStartedPulling="2026-04-17 17:33:48.114158554 +0000 UTC m=+454.689506821" lastFinishedPulling="2026-04-17 17:33:50.385365156 +0000 UTC m=+456.960713425" observedRunningTime="2026-04-17 17:33:50.603678787 +0000 UTC m=+457.179027068" watchObservedRunningTime="2026-04-17 17:33:50.604812192 +0000 UTC m=+457.180160479" Apr 17 17:33:52.071012 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.070971 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb"] Apr 17 17:33:52.073488 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.073463 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.076490 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.076456 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:33:52.076620 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.076587 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:33:52.077664 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.077641 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5w67t\"" Apr 17 17:33:52.083462 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.083435 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb"] Apr 17 17:33:52.154511 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.154474 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-util\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.154691 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.154538 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.154691 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.154612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfgrc\" (UniqueName: \"kubernetes.io/projected/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-kube-api-access-kfgrc\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.255299 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.255257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.255505 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.255312 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kfgrc\" (UniqueName: \"kubernetes.io/projected/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-kube-api-access-kfgrc\") pod 
\"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.255505 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.255367 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.255752 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.255726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.255832 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.255765 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.264196 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.264161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfgrc\" (UniqueName: \"kubernetes.io/projected/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-kube-api-access-kfgrc\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb\" (UID: 
\"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.385734 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.385648 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:33:52.518966 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.518933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb"] Apr 17 17:33:52.520318 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:33:52.520288 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e9d1c75_dc44_4b32_8fe1_2e7f13437c79.slice/crio-5ff4621c839c4bf09b16c29d98916c62a2a0e62d6182a8e0f76edc6441d78a4a WatchSource:0}: Error finding container 5ff4621c839c4bf09b16c29d98916c62a2a0e62d6182a8e0f76edc6441d78a4a: Status 404 returned error can't find the container with id 5ff4621c839c4bf09b16c29d98916c62a2a0e62d6182a8e0f76edc6441d78a4a Apr 17 17:33:52.588382 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.588350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" event={"ID":"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79","Type":"ContainerStarted","Data":"33690278446ef083fef417d5d1fd25bcccbc2cd9f695f3121862979ad567f200"} Apr 17 17:33:52.588382 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:52.588384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" event={"ID":"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79","Type":"ContainerStarted","Data":"5ff4621c839c4bf09b16c29d98916c62a2a0e62d6182a8e0f76edc6441d78a4a"} Apr 17 17:33:53.595499 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 17:33:53.595399 2576 generic.go:358] "Generic (PLEG): container finished" podID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerID="33690278446ef083fef417d5d1fd25bcccbc2cd9f695f3121862979ad567f200" exitCode=0 Apr 17 17:33:53.595865 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:53.595492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" event={"ID":"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79","Type":"ContainerDied","Data":"33690278446ef083fef417d5d1fd25bcccbc2cd9f695f3121862979ad567f200"} Apr 17 17:33:54.760826 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.760788 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-bjvf9"] Apr 17 17:33:54.763776 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.763752 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:33:54.766825 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.766789 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 17 17:33:54.766961 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.766791 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 17 17:33:54.768104 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.768073 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-n9zlq\"" Apr 17 17:33:54.771364 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.771338 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-bjvf9"] Apr 17 17:33:54.775610 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.775585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d7a9623-b0fa-4acd-9034-acdb79ee7335-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-bjvf9\" (UID: \"2d7a9623-b0fa-4acd-9034-acdb79ee7335\") " pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:33:54.775770 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.775746 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78kkr\" (UniqueName: \"kubernetes.io/projected/2d7a9623-b0fa-4acd-9034-acdb79ee7335-kube-api-access-78kkr\") pod \"cert-manager-webhook-597b96b99b-bjvf9\" (UID: \"2d7a9623-b0fa-4acd-9034-acdb79ee7335\") " pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:33:54.876295 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.876259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78kkr\" (UniqueName: \"kubernetes.io/projected/2d7a9623-b0fa-4acd-9034-acdb79ee7335-kube-api-access-78kkr\") pod \"cert-manager-webhook-597b96b99b-bjvf9\" (UID: \"2d7a9623-b0fa-4acd-9034-acdb79ee7335\") " pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:33:54.876471 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.876318 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d7a9623-b0fa-4acd-9034-acdb79ee7335-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-bjvf9\" (UID: \"2d7a9623-b0fa-4acd-9034-acdb79ee7335\") " pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:33:54.885933 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.885904 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d7a9623-b0fa-4acd-9034-acdb79ee7335-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-bjvf9\" (UID: \"2d7a9623-b0fa-4acd-9034-acdb79ee7335\") " 
pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:33:54.886113 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:54.886071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78kkr\" (UniqueName: \"kubernetes.io/projected/2d7a9623-b0fa-4acd-9034-acdb79ee7335-kube-api-access-78kkr\") pod \"cert-manager-webhook-597b96b99b-bjvf9\" (UID: \"2d7a9623-b0fa-4acd-9034-acdb79ee7335\") " pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:33:55.084684 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:55.084591 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:33:55.230961 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:55.230933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-bjvf9"] Apr 17 17:33:55.233877 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:33:55.233842 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d7a9623_b0fa_4acd_9034_acdb79ee7335.slice/crio-72df3a355495f2ff23b6fe22918016285a59c6c63434eec21caae2d9ccce42b0 WatchSource:0}: Error finding container 72df3a355495f2ff23b6fe22918016285a59c6c63434eec21caae2d9ccce42b0: Status 404 returned error can't find the container with id 72df3a355495f2ff23b6fe22918016285a59c6c63434eec21caae2d9ccce42b0 Apr 17 17:33:55.604332 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:55.604283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" event={"ID":"2d7a9623-b0fa-4acd-9034-acdb79ee7335","Type":"ContainerStarted","Data":"72df3a355495f2ff23b6fe22918016285a59c6c63434eec21caae2d9ccce42b0"} Apr 17 17:33:57.376401 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.376364 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["cert-manager/cert-manager-cainjector-8966b78d4-gdqx8"] Apr 17 17:33:57.396273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.396233 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-gdqx8"] Apr 17 17:33:57.396444 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.396349 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" Apr 17 17:33:57.399459 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.399433 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-cjgj5\"" Apr 17 17:33:57.498655 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.498617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdd4t\" (UniqueName: \"kubernetes.io/projected/ca0df7b5-305b-46e6-9804-0d04bbc2a456-kube-api-access-sdd4t\") pod \"cert-manager-cainjector-8966b78d4-gdqx8\" (UID: \"ca0df7b5-305b-46e6-9804-0d04bbc2a456\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" Apr 17 17:33:57.498828 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.498663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca0df7b5-305b-46e6-9804-0d04bbc2a456-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-gdqx8\" (UID: \"ca0df7b5-305b-46e6-9804-0d04bbc2a456\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" Apr 17 17:33:57.599634 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.599596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdd4t\" (UniqueName: \"kubernetes.io/projected/ca0df7b5-305b-46e6-9804-0d04bbc2a456-kube-api-access-sdd4t\") pod \"cert-manager-cainjector-8966b78d4-gdqx8\" (UID: 
\"ca0df7b5-305b-46e6-9804-0d04bbc2a456\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" Apr 17 17:33:57.599634 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.599638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca0df7b5-305b-46e6-9804-0d04bbc2a456-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-gdqx8\" (UID: \"ca0df7b5-305b-46e6-9804-0d04bbc2a456\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" Apr 17 17:33:57.608621 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.608592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca0df7b5-305b-46e6-9804-0d04bbc2a456-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-gdqx8\" (UID: \"ca0df7b5-305b-46e6-9804-0d04bbc2a456\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" Apr 17 17:33:57.608781 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.608714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdd4t\" (UniqueName: \"kubernetes.io/projected/ca0df7b5-305b-46e6-9804-0d04bbc2a456-kube-api-access-sdd4t\") pod \"cert-manager-cainjector-8966b78d4-gdqx8\" (UID: \"ca0df7b5-305b-46e6-9804-0d04bbc2a456\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" Apr 17 17:33:57.612254 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.612223 2576 generic.go:358] "Generic (PLEG): container finished" podID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerID="51af69875ba2cbb656efad3b7451448f5c1a9ffc0118ec5a70033e351ecd5d5d" exitCode=0 Apr 17 17:33:57.612377 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.612294 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" 
event={"ID":"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79","Type":"ContainerDied","Data":"51af69875ba2cbb656efad3b7451448f5c1a9ffc0118ec5a70033e351ecd5d5d"} Apr 17 17:33:57.708226 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.708192 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" Apr 17 17:33:57.849267 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:57.848741 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-gdqx8"] Apr 17 17:33:57.860972 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:33:57.860935 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca0df7b5_305b_46e6_9804_0d04bbc2a456.slice/crio-cd81aacd529c01d5ba9a3d1bafad6316f743721a4302926cecdd55be1f211315 WatchSource:0}: Error finding container cd81aacd529c01d5ba9a3d1bafad6316f743721a4302926cecdd55be1f211315: Status 404 returned error can't find the container with id cd81aacd529c01d5ba9a3d1bafad6316f743721a4302926cecdd55be1f211315 Apr 17 17:33:58.619310 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:58.619264 2576 generic.go:358] "Generic (PLEG): container finished" podID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerID="9bc13762a5328f469e10fec2977224fb913d7fe0a36fc6cf158d3785eca37ead" exitCode=0 Apr 17 17:33:58.619781 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:58.619367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" event={"ID":"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79","Type":"ContainerDied","Data":"9bc13762a5328f469e10fec2977224fb913d7fe0a36fc6cf158d3785eca37ead"} Apr 17 17:33:58.620645 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:33:58.620616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" 
event={"ID":"ca0df7b5-305b-46e6-9804-0d04bbc2a456","Type":"ContainerStarted","Data":"cd81aacd529c01d5ba9a3d1bafad6316f743721a4302926cecdd55be1f211315"} Apr 17 17:34:00.208788 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.208764 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:34:00.222756 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.222726 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfgrc\" (UniqueName: \"kubernetes.io/projected/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-kube-api-access-kfgrc\") pod \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " Apr 17 17:34:00.222891 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.222811 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-util\") pod \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " Apr 17 17:34:00.222891 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.222833 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-bundle\") pod \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\" (UID: \"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79\") " Apr 17 17:34:00.223239 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.223211 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-bundle" (OuterVolumeSpecName: "bundle") pod "4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" (UID: "4e9d1c75-dc44-4b32-8fe1-2e7f13437c79"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:34:00.224774 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.224749 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-kube-api-access-kfgrc" (OuterVolumeSpecName: "kube-api-access-kfgrc") pod "4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" (UID: "4e9d1c75-dc44-4b32-8fe1-2e7f13437c79"). InnerVolumeSpecName "kube-api-access-kfgrc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:00.227651 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.227623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-util" (OuterVolumeSpecName: "util") pod "4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" (UID: "4e9d1c75-dc44-4b32-8fe1-2e7f13437c79"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:34:00.323972 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.323948 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kfgrc\" (UniqueName: \"kubernetes.io/projected/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-kube-api-access-kfgrc\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:00.323972 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.323972 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:00.324184 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.323982 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e9d1c75-dc44-4b32-8fe1-2e7f13437c79-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:00.628962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.628923 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" event={"ID":"2d7a9623-b0fa-4acd-9034-acdb79ee7335","Type":"ContainerStarted","Data":"5cb9a81f64f8b6a13bdf49bb9a17a3af55d6d519820d4b351f30fcb615be0871"} Apr 17 17:34:00.629328 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.628987 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:34:00.630479 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.630451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" event={"ID":"ca0df7b5-305b-46e6-9804-0d04bbc2a456","Type":"ContainerStarted","Data":"0562b8d0a6b6f957b498323a1365b9fc9f5ddd26c1309567199a262f9c9be347"} Apr 17 17:34:00.632101 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.632072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" event={"ID":"4e9d1c75-dc44-4b32-8fe1-2e7f13437c79","Type":"ContainerDied","Data":"5ff4621c839c4bf09b16c29d98916c62a2a0e62d6182a8e0f76edc6441d78a4a"} Apr 17 17:34:00.632238 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.632104 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ff4621c839c4bf09b16c29d98916c62a2a0e62d6182a8e0f76edc6441d78a4a" Apr 17 17:34:00.632238 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.632132 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fhdclb" Apr 17 17:34:00.647982 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.647933 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" podStartSLOduration=1.6143680200000001 podStartE2EDuration="6.647918757s" podCreationTimestamp="2026-04-17 17:33:54 +0000 UTC" firstStartedPulling="2026-04-17 17:33:55.236092742 +0000 UTC m=+461.811441023" lastFinishedPulling="2026-04-17 17:34:00.269643494 +0000 UTC m=+466.844991760" observedRunningTime="2026-04-17 17:34:00.646921937 +0000 UTC m=+467.222270225" watchObservedRunningTime="2026-04-17 17:34:00.647918757 +0000 UTC m=+467.223267045" Apr 17 17:34:00.666543 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:00.666484 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-gdqx8" podStartSLOduration=1.2524632549999999 podStartE2EDuration="3.666468814s" podCreationTimestamp="2026-04-17 17:33:57 +0000 UTC" firstStartedPulling="2026-04-17 17:33:57.862980302 +0000 UTC m=+464.438328568" lastFinishedPulling="2026-04-17 17:34:00.276985854 +0000 UTC m=+466.852334127" observedRunningTime="2026-04-17 17:34:00.665338312 +0000 UTC m=+467.240686602" watchObservedRunningTime="2026-04-17 17:34:00.666468814 +0000 UTC m=+467.241817103" Apr 17 17:34:06.638259 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:06.638227 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-bjvf9" Apr 17 17:34:07.260587 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.260547 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb"] Apr 17 17:34:07.260879 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.260866 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerName="util" Apr 17 17:34:07.260923 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.260880 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerName="util" Apr 17 17:34:07.260923 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.260892 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerName="extract" Apr 17 17:34:07.260923 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.260898 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerName="extract" Apr 17 17:34:07.260923 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.260916 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerName="pull" Apr 17 17:34:07.260923 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.260922 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerName="pull" Apr 17 17:34:07.261125 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.260975 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e9d1c75-dc44-4b32-8fe1-2e7f13437c79" containerName="extract" Apr 17 17:34:07.263940 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.263922 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" Apr 17 17:34:07.266782 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.266759 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:34:07.266928 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.266786 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 17:34:07.268015 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.267997 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-xdwxj\"" Apr 17 17:34:07.271820 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.271788 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb"] Apr 17 17:34:07.383113 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.383066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1eb573e0-c340-400e-a9ee-c6bc80eddb58-tmp\") pod \"openshift-lws-operator-bfc7f696d-g7jcb\" (UID: \"1eb573e0-c340-400e-a9ee-c6bc80eddb58\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" Apr 17 17:34:07.383113 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.383110 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqnm5\" (UniqueName: \"kubernetes.io/projected/1eb573e0-c340-400e-a9ee-c6bc80eddb58-kube-api-access-qqnm5\") pod \"openshift-lws-operator-bfc7f696d-g7jcb\" (UID: \"1eb573e0-c340-400e-a9ee-c6bc80eddb58\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" Apr 17 17:34:07.483555 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.483518 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1eb573e0-c340-400e-a9ee-c6bc80eddb58-tmp\") pod \"openshift-lws-operator-bfc7f696d-g7jcb\" (UID: \"1eb573e0-c340-400e-a9ee-c6bc80eddb58\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" Apr 17 17:34:07.483555 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.483555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqnm5\" (UniqueName: \"kubernetes.io/projected/1eb573e0-c340-400e-a9ee-c6bc80eddb58-kube-api-access-qqnm5\") pod \"openshift-lws-operator-bfc7f696d-g7jcb\" (UID: \"1eb573e0-c340-400e-a9ee-c6bc80eddb58\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" Apr 17 17:34:07.483959 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.483932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1eb573e0-c340-400e-a9ee-c6bc80eddb58-tmp\") pod \"openshift-lws-operator-bfc7f696d-g7jcb\" (UID: \"1eb573e0-c340-400e-a9ee-c6bc80eddb58\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" Apr 17 17:34:07.497201 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.497167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqnm5\" (UniqueName: \"kubernetes.io/projected/1eb573e0-c340-400e-a9ee-c6bc80eddb58-kube-api-access-qqnm5\") pod \"openshift-lws-operator-bfc7f696d-g7jcb\" (UID: \"1eb573e0-c340-400e-a9ee-c6bc80eddb58\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" Apr 17 17:34:07.574317 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.574225 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" Apr 17 17:34:07.703104 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:07.703078 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb"] Apr 17 17:34:07.705633 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:34:07.705602 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1eb573e0_c340_400e_a9ee_c6bc80eddb58.slice/crio-481987880ebda4d31558008e98dc1bc0bf07f542bdb3b532daaf11c03f0db353 WatchSource:0}: Error finding container 481987880ebda4d31558008e98dc1bc0bf07f542bdb3b532daaf11c03f0db353: Status 404 returned error can't find the container with id 481987880ebda4d31558008e98dc1bc0bf07f542bdb3b532daaf11c03f0db353 Apr 17 17:34:08.661550 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:08.661516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" event={"ID":"1eb573e0-c340-400e-a9ee-c6bc80eddb58","Type":"ContainerStarted","Data":"481987880ebda4d31558008e98dc1bc0bf07f542bdb3b532daaf11c03f0db353"} Apr 17 17:34:11.674924 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:11.674879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" event={"ID":"1eb573e0-c340-400e-a9ee-c6bc80eddb58","Type":"ContainerStarted","Data":"46207ad5fe6eadad8d22307a747846141e6008f6248ef3bc16871af2aeac64ef"} Apr 17 17:34:11.693548 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:11.693495 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-g7jcb" podStartSLOduration=1.246252839 podStartE2EDuration="4.693479745s" podCreationTimestamp="2026-04-17 17:34:07 +0000 UTC" firstStartedPulling="2026-04-17 17:34:07.707469803 +0000 UTC m=+474.282818068" 
lastFinishedPulling="2026-04-17 17:34:11.154696703 +0000 UTC m=+477.730044974" observedRunningTime="2026-04-17 17:34:11.691921969 +0000 UTC m=+478.267270260" watchObservedRunningTime="2026-04-17 17:34:11.693479745 +0000 UTC m=+478.268828032" Apr 17 17:34:13.786311 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.786268 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg"] Apr 17 17:34:13.789909 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.789889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.792853 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.792827 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:34:13.793993 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.793973 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:34:13.794117 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.793989 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5w67t\"" Apr 17 17:34:13.797457 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.797428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg"] Apr 17 17:34:13.841358 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.841309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.841358 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.841350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s48xx\" (UniqueName: \"kubernetes.io/projected/e4def3b6-64a7-4151-aeab-b0d073a5b634-kube-api-access-s48xx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.841644 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.841464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.942316 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.942279 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.942482 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.942342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.942482 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.942362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s48xx\" (UniqueName: \"kubernetes.io/projected/e4def3b6-64a7-4151-aeab-b0d073a5b634-kube-api-access-s48xx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.942657 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.942638 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.942722 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.942701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:13.951483 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.951460 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:34:13.962382 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:13.962352 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:34:13.973273 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:34:13.973234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s48xx\" (UniqueName: \"kubernetes.io/projected/e4def3b6-64a7-4151-aeab-b0d073a5b634-kube-api-access-s48xx\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:14.102765 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:14.102674 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5w67t\"" Apr 17 17:34:14.109951 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:14.109922 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:14.235217 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:14.235191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg"] Apr 17 17:34:14.237249 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:34:14.237218 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4def3b6_64a7_4151_aeab_b0d073a5b634.slice/crio-a38c9c9addf1226bcfbfe5cea397b739787b85f6f6cd2b3c4bfa4f0508b7e61b WatchSource:0}: Error finding container a38c9c9addf1226bcfbfe5cea397b739787b85f6f6cd2b3c4bfa4f0508b7e61b: Status 404 returned error can't find the container with id a38c9c9addf1226bcfbfe5cea397b739787b85f6f6cd2b3c4bfa4f0508b7e61b Apr 17 17:34:14.686098 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:14.686059 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerID="467d1d8577368ceb031c3639b1ef38b244a5478b5bb91edcef1f9d6effc8f944" exitCode=0 Apr 17 
17:34:14.686262 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:14.686166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" event={"ID":"e4def3b6-64a7-4151-aeab-b0d073a5b634","Type":"ContainerDied","Data":"467d1d8577368ceb031c3639b1ef38b244a5478b5bb91edcef1f9d6effc8f944"} Apr 17 17:34:14.686262 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:14.686187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" event={"ID":"e4def3b6-64a7-4151-aeab-b0d073a5b634","Type":"ContainerStarted","Data":"a38c9c9addf1226bcfbfe5cea397b739787b85f6f6cd2b3c4bfa4f0508b7e61b"} Apr 17 17:34:15.691415 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:15.691323 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerID="aa67927ab85fa6aa60511051eba28cb5f2f8fc9e42fda071d47fa671be63ac04" exitCode=0 Apr 17 17:34:15.691415 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:15.691386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" event={"ID":"e4def3b6-64a7-4151-aeab-b0d073a5b634","Type":"ContainerDied","Data":"aa67927ab85fa6aa60511051eba28cb5f2f8fc9e42fda071d47fa671be63ac04"} Apr 17 17:34:16.697257 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:16.697223 2576 generic.go:358] "Generic (PLEG): container finished" podID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerID="53757d02af606d508fc0858a1852b59ac57be5387e957e25c8a59a2dd00c6a11" exitCode=0 Apr 17 17:34:16.697647 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:16.697308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" 
event={"ID":"e4def3b6-64a7-4151-aeab-b0d073a5b634","Type":"ContainerDied","Data":"53757d02af606d508fc0858a1852b59ac57be5387e957e25c8a59a2dd00c6a11"} Apr 17 17:34:17.824453 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:17.824423 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:17.976185 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:17.976093 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s48xx\" (UniqueName: \"kubernetes.io/projected/e4def3b6-64a7-4151-aeab-b0d073a5b634-kube-api-access-s48xx\") pod \"e4def3b6-64a7-4151-aeab-b0d073a5b634\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " Apr 17 17:34:17.976185 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:17.976143 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-bundle\") pod \"e4def3b6-64a7-4151-aeab-b0d073a5b634\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " Apr 17 17:34:17.976391 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:17.976205 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-util\") pod \"e4def3b6-64a7-4151-aeab-b0d073a5b634\" (UID: \"e4def3b6-64a7-4151-aeab-b0d073a5b634\") " Apr 17 17:34:17.976906 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:17.976867 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-bundle" (OuterVolumeSpecName: "bundle") pod "e4def3b6-64a7-4151-aeab-b0d073a5b634" (UID: "e4def3b6-64a7-4151-aeab-b0d073a5b634"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:34:17.978282 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:17.978253 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4def3b6-64a7-4151-aeab-b0d073a5b634-kube-api-access-s48xx" (OuterVolumeSpecName: "kube-api-access-s48xx") pod "e4def3b6-64a7-4151-aeab-b0d073a5b634" (UID: "e4def3b6-64a7-4151-aeab-b0d073a5b634"). InnerVolumeSpecName "kube-api-access-s48xx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:17.981750 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:17.981719 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-util" (OuterVolumeSpecName: "util") pod "e4def3b6-64a7-4151-aeab-b0d073a5b634" (UID: "e4def3b6-64a7-4151-aeab-b0d073a5b634"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:34:18.076913 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:18.076875 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s48xx\" (UniqueName: \"kubernetes.io/projected/e4def3b6-64a7-4151-aeab-b0d073a5b634-kube-api-access-s48xx\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:18.076913 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:18.076906 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:18.076913 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:18.076916 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e4def3b6-64a7-4151-aeab-b0d073a5b634-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:18.706250 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:18.706223 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" Apr 17 17:34:18.706250 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:18.706235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5nf6zg" event={"ID":"e4def3b6-64a7-4151-aeab-b0d073a5b634","Type":"ContainerDied","Data":"a38c9c9addf1226bcfbfe5cea397b739787b85f6f6cd2b3c4bfa4f0508b7e61b"} Apr 17 17:34:18.706455 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:18.706264 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a38c9c9addf1226bcfbfe5cea397b739787b85f6f6cd2b3c4bfa4f0508b7e61b" Apr 17 17:34:22.423442 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.423404 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"] Apr 17 17:34:22.423818 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.423695 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerName="util" Apr 17 17:34:22.423818 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.423706 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerName="util" Apr 17 17:34:22.423818 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.423724 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerName="extract" Apr 17 17:34:22.423818 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.423730 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerName="extract" Apr 17 17:34:22.423818 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.423737 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerName="pull" Apr 17 17:34:22.423818 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.423744 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerName="pull" Apr 17 17:34:22.423818 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.423800 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4def3b6-64a7-4151-aeab-b0d073a5b634" containerName="extract" Apr 17 17:34:22.426388 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.426367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf" Apr 17 17:34:22.429560 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.429531 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 17:34:22.431132 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.431109 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 17:34:22.431364 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.431350 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-9985d\"" Apr 17 17:34:22.431417 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.431401 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 17:34:22.442667 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.442634 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"] Apr 17 17:34:22.520162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.520118 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/8b68fd41-28f3-4939-80d5-5fc14f95771b-metrics-cert\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.520162 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.520161 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b68fd41-28f3-4939-80d5-5fc14f95771b-cert\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.520420 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.520192 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6mq\" (UniqueName: \"kubernetes.io/projected/8b68fd41-28f3-4939-80d5-5fc14f95771b-kube-api-access-zv6mq\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.520420 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.520262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8b68fd41-28f3-4939-80d5-5fc14f95771b-manager-config\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.620934 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.620892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6mq\" (UniqueName: \"kubernetes.io/projected/8b68fd41-28f3-4939-80d5-5fc14f95771b-kube-api-access-zv6mq\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.620934 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.620938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8b68fd41-28f3-4939-80d5-5fc14f95771b-manager-config\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.621176 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.621045 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b68fd41-28f3-4939-80d5-5fc14f95771b-metrics-cert\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.621176 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.621081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b68fd41-28f3-4939-80d5-5fc14f95771b-cert\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.621772 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.621748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8b68fd41-28f3-4939-80d5-5fc14f95771b-manager-config\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.623657 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.623630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8b68fd41-28f3-4939-80d5-5fc14f95771b-cert\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.623787 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.623729 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8b68fd41-28f3-4939-80d5-5fc14f95771b-metrics-cert\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.630676 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.630650 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6mq\" (UniqueName: \"kubernetes.io/projected/8b68fd41-28f3-4939-80d5-5fc14f95771b-kube-api-access-zv6mq\") pod \"lws-controller-manager-6f45766749-7dhmf\" (UID: \"8b68fd41-28f3-4939-80d5-5fc14f95771b\") " pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.737255 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.737159 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:22.872865 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:22.872807 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"]
Apr 17 17:34:22.875095 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:34:22.875065 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b68fd41_28f3_4939_80d5_5fc14f95771b.slice/crio-2cb7ca69f572637d6d393173c368ae700b17650a11810e20d2fd51c361288bc0 WatchSource:0}: Error finding container 2cb7ca69f572637d6d393173c368ae700b17650a11810e20d2fd51c361288bc0: Status 404 returned error can't find the container with id 2cb7ca69f572637d6d393173c368ae700b17650a11810e20d2fd51c361288bc0
Apr 17 17:34:23.722488 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:23.722451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf" event={"ID":"8b68fd41-28f3-4939-80d5-5fc14f95771b","Type":"ContainerStarted","Data":"2cb7ca69f572637d6d393173c368ae700b17650a11810e20d2fd51c361288bc0"}
Apr 17 17:34:25.731086 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:25.731044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf" event={"ID":"8b68fd41-28f3-4939-80d5-5fc14f95771b","Type":"ContainerStarted","Data":"95b80e9556a567d2b6dfb131da6cd92f0205f30e81f55f7397eb7c08e2aa01bf"}
Apr 17 17:34:25.731481 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:25.731162 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:25.750339 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:25.750276 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf" podStartSLOduration=1.408851684 podStartE2EDuration="3.75025581s" podCreationTimestamp="2026-04-17 17:34:22 +0000 UTC" firstStartedPulling="2026-04-17 17:34:22.876902872 +0000 UTC m=+489.452251139" lastFinishedPulling="2026-04-17 17:34:25.218306981 +0000 UTC m=+491.793655265" observedRunningTime="2026-04-17 17:34:25.7495248 +0000 UTC m=+492.324873090" watchObservedRunningTime="2026-04-17 17:34:25.75025581 +0000 UTC m=+492.325604104"
Apr 17 17:34:27.930751 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:27.930714 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"]
Apr 17 17:34:27.934531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:27.934506 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:27.937103 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:27.937077 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 17 17:34:27.937470 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:27.937444 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 17 17:34:27.937470 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:27.937456 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pdn5v\""
Apr 17 17:34:27.937687 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:27.937487 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 17 17:34:27.937746 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:27.937736 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 17 17:34:27.947912 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:27.947886 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"]
Apr 17 17:34:28.070749 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.070712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k89rr\" (UniqueName: \"kubernetes.io/projected/db0b2ad3-0434-48fc-b7d7-385e8237ba41-kube-api-access-k89rr\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.070949 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.070828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db0b2ad3-0434-48fc-b7d7-385e8237ba41-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.070949 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.070911 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db0b2ad3-0434-48fc-b7d7-385e8237ba41-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.172189 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.172130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k89rr\" (UniqueName: \"kubernetes.io/projected/db0b2ad3-0434-48fc-b7d7-385e8237ba41-kube-api-access-k89rr\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.172511 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.172243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db0b2ad3-0434-48fc-b7d7-385e8237ba41-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.172511 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.172344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db0b2ad3-0434-48fc-b7d7-385e8237ba41-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.175152 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.175125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/db0b2ad3-0434-48fc-b7d7-385e8237ba41-webhook-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.175363 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.175340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/db0b2ad3-0434-48fc-b7d7-385e8237ba41-apiservice-cert\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.184395 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.184324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k89rr\" (UniqueName: \"kubernetes.io/projected/db0b2ad3-0434-48fc-b7d7-385e8237ba41-kube-api-access-k89rr\") pod \"opendatahub-operator-controller-manager-77fb85d776-4gb5w\" (UID: \"db0b2ad3-0434-48fc-b7d7-385e8237ba41\") " pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.247441 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.247401 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:28.384668 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.384636 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"]
Apr 17 17:34:28.386772 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:34:28.386748 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb0b2ad3_0434_48fc_b7d7_385e8237ba41.slice/crio-6d86ea18870c429f2f16b7b346abdc4192db9fbd2c8d7bfe8765acaf82daba7c WatchSource:0}: Error finding container 6d86ea18870c429f2f16b7b346abdc4192db9fbd2c8d7bfe8765acaf82daba7c: Status 404 returned error can't find the container with id 6d86ea18870c429f2f16b7b346abdc4192db9fbd2c8d7bfe8765acaf82daba7c
Apr 17 17:34:28.407645 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.407579 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"]
Apr 17 17:34:28.410879 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.410857 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.417596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.417573 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 17:34:28.417596 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.417596 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5w67t\""
Apr 17 17:34:28.417822 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.417596 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 17:34:28.423143 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.423118 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"]
Apr 17 17:34:28.576344 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.576311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.576522 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.576383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsqx8\" (UniqueName: \"kubernetes.io/projected/f475ea8c-35ca-41d7-a9ae-610b68a05fef-kube-api-access-nsqx8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.576522 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.576429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.677170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.677135 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.677369 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.677186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.677369 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.677234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsqx8\" (UniqueName: \"kubernetes.io/projected/f475ea8c-35ca-41d7-a9ae-610b68a05fef-kube-api-access-nsqx8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.677603 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.677579 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.677603 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.677598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.686471 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.686444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsqx8\" (UniqueName: \"kubernetes.io/projected/f475ea8c-35ca-41d7-a9ae-610b68a05fef-kube-api-access-nsqx8\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.721444 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.721415 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:28.744036 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.743986 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w" event={"ID":"db0b2ad3-0434-48fc-b7d7-385e8237ba41","Type":"ContainerStarted","Data":"6d86ea18870c429f2f16b7b346abdc4192db9fbd2c8d7bfe8765acaf82daba7c"}
Apr 17 17:34:28.858775 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:28.858732 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"]
Apr 17 17:34:28.861317 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:34:28.861288 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf475ea8c_35ca_41d7_a9ae_610b68a05fef.slice/crio-c9e1df27b8a095f308dac744a163594898306d1da53626cf7649c71040a99fe8 WatchSource:0}: Error finding container c9e1df27b8a095f308dac744a163594898306d1da53626cf7649c71040a99fe8: Status 404 returned error can't find the container with id c9e1df27b8a095f308dac744a163594898306d1da53626cf7649c71040a99fe8
Apr 17 17:34:29.749477 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:29.749439 2576 generic.go:358] "Generic (PLEG): container finished" podID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerID="ae6607ae69f9cda4e8f73290292b13b2f0e5a2f41dbdeb5eaeaadaaf9426deb0" exitCode=0
Apr 17 17:34:29.749958 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:29.749545 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt" event={"ID":"f475ea8c-35ca-41d7-a9ae-610b68a05fef","Type":"ContainerDied","Data":"ae6607ae69f9cda4e8f73290292b13b2f0e5a2f41dbdeb5eaeaadaaf9426deb0"}
Apr 17 17:34:29.749958 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:29.749587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt" event={"ID":"f475ea8c-35ca-41d7-a9ae-610b68a05fef","Type":"ContainerStarted","Data":"c9e1df27b8a095f308dac744a163594898306d1da53626cf7649c71040a99fe8"}
Apr 17 17:34:31.758170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:31.758126 2576 generic.go:358] "Generic (PLEG): container finished" podID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerID="80d0d0bc1f80b30c6300f42f067790e870a633601d0a4a10de6e71fac025f71b" exitCode=0
Apr 17 17:34:31.758649 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:31.758197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt" event={"ID":"f475ea8c-35ca-41d7-a9ae-610b68a05fef","Type":"ContainerDied","Data":"80d0d0bc1f80b30c6300f42f067790e870a633601d0a4a10de6e71fac025f71b"}
Apr 17 17:34:31.759958 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:31.759903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w" event={"ID":"db0b2ad3-0434-48fc-b7d7-385e8237ba41","Type":"ContainerStarted","Data":"b7e6e7df74cbabb23e0422a0a31e25733901aab0fff430e45dd9a865662c2076"}
Apr 17 17:34:31.760149 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:31.760079 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:31.797260 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:31.797205 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w" podStartSLOduration=2.364095038 podStartE2EDuration="4.79718886s" podCreationTimestamp="2026-04-17 17:34:27 +0000 UTC" firstStartedPulling="2026-04-17 17:34:28.388610802 +0000 UTC m=+494.963959068" lastFinishedPulling="2026-04-17 17:34:30.821704624 +0000 UTC m=+497.397052890" observedRunningTime="2026-04-17 17:34:31.79563978 +0000 UTC m=+498.370988083" watchObservedRunningTime="2026-04-17 17:34:31.79718886 +0000 UTC m=+498.372537186"
Apr 17 17:34:32.765908 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:32.765876 2576 generic.go:358] "Generic (PLEG): container finished" podID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerID="de0e664f2886eb84066ae699e3d32b4ee4d79f748e5e2aac907f23ea519205d9" exitCode=0
Apr 17 17:34:32.766282 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:32.765951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt" event={"ID":"f475ea8c-35ca-41d7-a9ae-610b68a05fef","Type":"ContainerDied","Data":"de0e664f2886eb84066ae699e3d32b4ee4d79f748e5e2aac907f23ea519205d9"}
Apr 17 17:34:33.898205 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:33.898178 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:34.021772 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.021677 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-bundle\") pod \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") "
Apr 17 17:34:34.021772 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.021743 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsqx8\" (UniqueName: \"kubernetes.io/projected/f475ea8c-35ca-41d7-a9ae-610b68a05fef-kube-api-access-nsqx8\") pod \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") "
Apr 17 17:34:34.021958 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.021843 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-util\") pod \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\" (UID: \"f475ea8c-35ca-41d7-a9ae-610b68a05fef\") "
Apr 17 17:34:34.022509 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.022480 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-bundle" (OuterVolumeSpecName: "bundle") pod "f475ea8c-35ca-41d7-a9ae-610b68a05fef" (UID: "f475ea8c-35ca-41d7-a9ae-610b68a05fef"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:34:34.023864 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.023831 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f475ea8c-35ca-41d7-a9ae-610b68a05fef-kube-api-access-nsqx8" (OuterVolumeSpecName: "kube-api-access-nsqx8") pod "f475ea8c-35ca-41d7-a9ae-610b68a05fef" (UID: "f475ea8c-35ca-41d7-a9ae-610b68a05fef"). InnerVolumeSpecName "kube-api-access-nsqx8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:34:34.027210 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.027172 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-util" (OuterVolumeSpecName: "util") pod "f475ea8c-35ca-41d7-a9ae-610b68a05fef" (UID: "f475ea8c-35ca-41d7-a9ae-610b68a05fef"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:34:34.123230 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.123190 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:34:34.123230 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.123222 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f475ea8c-35ca-41d7-a9ae-610b68a05fef-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:34:34.123230 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.123232 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nsqx8\" (UniqueName: \"kubernetes.io/projected/f475ea8c-35ca-41d7-a9ae-610b68a05fef-kube-api-access-nsqx8\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:34:34.774768 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.774731 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt" event={"ID":"f475ea8c-35ca-41d7-a9ae-610b68a05fef","Type":"ContainerDied","Data":"c9e1df27b8a095f308dac744a163594898306d1da53626cf7649c71040a99fe8"}
Apr 17 17:34:34.774768 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.774773 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9e1df27b8a095f308dac744a163594898306d1da53626cf7649c71040a99fe8"
Apr 17 17:34:34.774980 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:34.774799 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9xlwtt"
Apr 17 17:34:36.737934 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:36.737903 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6f45766749-7dhmf"
Apr 17 17:34:42.769043 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:42.768994 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-77fb85d776-4gb5w"
Apr 17 17:34:46.192861 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.192821 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-598f578945-4x886"]
Apr 17 17:34:46.193273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.193171 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerName="pull"
Apr 17 17:34:46.193273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.193186 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerName="pull"
Apr 17 17:34:46.193273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.193199 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerName="util"
Apr 17 17:34:46.193273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.193204 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerName="util"
Apr 17 17:34:46.193273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.193211 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerName="extract"
Apr 17 17:34:46.193273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.193216 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerName="extract"
Apr 17 17:34:46.193273 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.193268 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f475ea8c-35ca-41d7-a9ae-610b68a05fef" containerName="extract"
Apr 17 17:34:46.197751 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.197731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.201113 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.201090 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 17 17:34:46.202804 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.202771 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-k5jkm\""
Apr 17 17:34:46.202937 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.202877 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 17 17:34:46.202937 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.202888 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 17:34:46.203081 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.203013 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 17:34:46.205437 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.205147 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-598f578945-4x886"]
Apr 17 17:34:46.214160 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.214131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/341f200d-f57e-4466-b96c-0b3d9a8d03ff-tls-certs\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.214332 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.214195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/341f200d-f57e-4466-b96c-0b3d9a8d03ff-tmp\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.214332 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.214213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzqzh\" (UniqueName: \"kubernetes.io/projected/341f200d-f57e-4466-b96c-0b3d9a8d03ff-kube-api-access-gzqzh\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.314644 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.314591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/341f200d-f57e-4466-b96c-0b3d9a8d03ff-tmp\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.314644 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.314646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzqzh\" (UniqueName: \"kubernetes.io/projected/341f200d-f57e-4466-b96c-0b3d9a8d03ff-kube-api-access-gzqzh\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.314870 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.314673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/341f200d-f57e-4466-b96c-0b3d9a8d03ff-tls-certs\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.317006 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.316985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/341f200d-f57e-4466-b96c-0b3d9a8d03ff-tmp\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.317250 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.317228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/341f200d-f57e-4466-b96c-0b3d9a8d03ff-tls-certs\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.322337 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.322310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzqzh\" (UniqueName: \"kubernetes.io/projected/341f200d-f57e-4466-b96c-0b3d9a8d03ff-kube-api-access-gzqzh\") pod \"kube-auth-proxy-598f578945-4x886\" (UID: \"341f200d-f57e-4466-b96c-0b3d9a8d03ff\") " pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.508922 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.508890 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-598f578945-4x886"
Apr 17 17:34:46.632800 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.632771 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-598f578945-4x886"]
Apr 17 17:34:46.635467 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:34:46.635439 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod341f200d_f57e_4466_b96c_0b3d9a8d03ff.slice/crio-0949a6be35b09345451087e22c54a7ef114e9e044fef972e4505548b44c85cff WatchSource:0}: Error finding container 0949a6be35b09345451087e22c54a7ef114e9e044fef972e4505548b44c85cff: Status 404 returned error can't find the container with id 0949a6be35b09345451087e22c54a7ef114e9e044fef972e4505548b44c85cff
Apr 17 17:34:46.819186 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.819105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-598f578945-4x886" event={"ID":"341f200d-f57e-4466-b96c-0b3d9a8d03ff","Type":"ContainerStarted","Data":"0949a6be35b09345451087e22c54a7ef114e9e044fef972e4505548b44c85cff"}
Apr 17 17:34:47.000142 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:46.998289 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm"]
Apr 17 17:34:47.004037 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.003994 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.012104 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.012074 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5w67t\"" Apr 17 17:34:47.012258 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.012180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:34:47.012258 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.012180 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:34:47.013916 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.013892 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm"] Apr 17 17:34:47.022840 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.022808 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tf8m\" (UniqueName: \"kubernetes.io/projected/074b93c3-4396-48d8-adfe-9ad1672009e1-kube-api-access-4tf8m\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.022991 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.022864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.022991 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.022890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.123751 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.123662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.123751 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.123700 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.123981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.123771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tf8m\" (UniqueName: \"kubernetes.io/projected/074b93c3-4396-48d8-adfe-9ad1672009e1-kube-api-access-4tf8m\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.124216 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.124183 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.124508 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.124487 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.141264 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.141231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tf8m\" (UniqueName: \"kubernetes.io/projected/074b93c3-4396-48d8-adfe-9ad1672009e1-kube-api-access-4tf8m\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.314852 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.314814 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:47.479956 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.479927 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm"] Apr 17 17:34:47.482686 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:34:47.482652 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod074b93c3_4396_48d8_adfe_9ad1672009e1.slice/crio-6d9b61212bece56ab498f6ef7ea4f82da862a603f81ae33db37ca092873707d5 WatchSource:0}: Error finding container 6d9b61212bece56ab498f6ef7ea4f82da862a603f81ae33db37ca092873707d5: Status 404 returned error can't find the container with id 6d9b61212bece56ab498f6ef7ea4f82da862a603f81ae33db37ca092873707d5 Apr 17 17:34:47.825899 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.825846 2576 generic.go:358] "Generic (PLEG): container finished" podID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerID="aec4d995c40e7f7e70c530eb6c0ce45804cfe057fa2e077501e64e3ad7716984" exitCode=0 Apr 17 17:34:47.826123 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.825956 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" event={"ID":"074b93c3-4396-48d8-adfe-9ad1672009e1","Type":"ContainerDied","Data":"aec4d995c40e7f7e70c530eb6c0ce45804cfe057fa2e077501e64e3ad7716984"} Apr 17 17:34:47.826123 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:47.825988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" event={"ID":"074b93c3-4396-48d8-adfe-9ad1672009e1","Type":"ContainerStarted","Data":"6d9b61212bece56ab498f6ef7ea4f82da862a603f81ae33db37ca092873707d5"} Apr 17 17:34:48.831998 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 17:34:48.831962 2576 generic.go:358] "Generic (PLEG): container finished" podID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerID="85ad4b83729537b10cb3e3ce450b4452629b013a1e8ce6527d6bedd4d45ce85d" exitCode=0 Apr 17 17:34:48.832384 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:48.832066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" event={"ID":"074b93c3-4396-48d8-adfe-9ad1672009e1","Type":"ContainerDied","Data":"85ad4b83729537b10cb3e3ce450b4452629b013a1e8ce6527d6bedd4d45ce85d"} Apr 17 17:34:49.838179 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:49.838141 2576 generic.go:358] "Generic (PLEG): container finished" podID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerID="47e7170b85f685aab034b34e4fa39dcf9220e7c59abecaf4d2528dfa566c205c" exitCode=0 Apr 17 17:34:49.838569 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:49.838225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" event={"ID":"074b93c3-4396-48d8-adfe-9ad1672009e1","Type":"ContainerDied","Data":"47e7170b85f685aab034b34e4fa39dcf9220e7c59abecaf4d2528dfa566c205c"} Apr 17 17:34:50.844421 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:50.844381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-598f578945-4x886" event={"ID":"341f200d-f57e-4466-b96c-0b3d9a8d03ff","Type":"ContainerStarted","Data":"2276ca4ec513621c5b7437dc7c996ded6f5741083e962ce82e19646b16c67f2a"} Apr 17 17:34:50.864603 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:50.864543 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-598f578945-4x886" podStartSLOduration=1.548855265 podStartE2EDuration="4.864527962s" podCreationTimestamp="2026-04-17 17:34:46 +0000 UTC" firstStartedPulling="2026-04-17 17:34:46.63727146 +0000 UTC 
m=+513.212619725" lastFinishedPulling="2026-04-17 17:34:49.952944135 +0000 UTC m=+516.528292422" observedRunningTime="2026-04-17 17:34:50.862190256 +0000 UTC m=+517.437538545" watchObservedRunningTime="2026-04-17 17:34:50.864527962 +0000 UTC m=+517.439876249" Apr 17 17:34:50.971942 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:50.971913 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:51.054935 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.054898 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tf8m\" (UniqueName: \"kubernetes.io/projected/074b93c3-4396-48d8-adfe-9ad1672009e1-kube-api-access-4tf8m\") pod \"074b93c3-4396-48d8-adfe-9ad1672009e1\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " Apr 17 17:34:51.055130 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.054961 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-bundle\") pod \"074b93c3-4396-48d8-adfe-9ad1672009e1\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " Apr 17 17:34:51.055130 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.054996 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-util\") pod \"074b93c3-4396-48d8-adfe-9ad1672009e1\" (UID: \"074b93c3-4396-48d8-adfe-9ad1672009e1\") " Apr 17 17:34:51.055931 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.055895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-bundle" (OuterVolumeSpecName: "bundle") pod "074b93c3-4396-48d8-adfe-9ad1672009e1" (UID: "074b93c3-4396-48d8-adfe-9ad1672009e1"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:34:51.057266 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.057244 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074b93c3-4396-48d8-adfe-9ad1672009e1-kube-api-access-4tf8m" (OuterVolumeSpecName: "kube-api-access-4tf8m") pod "074b93c3-4396-48d8-adfe-9ad1672009e1" (UID: "074b93c3-4396-48d8-adfe-9ad1672009e1"). InnerVolumeSpecName "kube-api-access-4tf8m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:51.060297 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.060269 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-util" (OuterVolumeSpecName: "util") pod "074b93c3-4396-48d8-adfe-9ad1672009e1" (UID: "074b93c3-4396-48d8-adfe-9ad1672009e1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:34:51.156075 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.155947 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:51.156075 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.155996 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4tf8m\" (UniqueName: \"kubernetes.io/projected/074b93c3-4396-48d8-adfe-9ad1672009e1-kube-api-access-4tf8m\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:51.156075 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.156012 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/074b93c3-4396-48d8-adfe-9ad1672009e1-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:34:51.849302 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.849270 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" Apr 17 17:34:51.849703 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.849263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835srlmm" event={"ID":"074b93c3-4396-48d8-adfe-9ad1672009e1","Type":"ContainerDied","Data":"6d9b61212bece56ab498f6ef7ea4f82da862a603f81ae33db37ca092873707d5"} Apr 17 17:34:51.849703 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:51.849377 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9b61212bece56ab498f6ef7ea4f82da862a603f81ae33db37ca092873707d5" Apr 17 17:34:56.955265 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.955229 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56"] Apr 17 17:34:56.955719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.955540 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerName="pull" Apr 17 17:34:56.955719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.955552 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerName="pull" Apr 17 17:34:56.955719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.955562 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerName="extract" Apr 17 17:34:56.955719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.955567 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerName="extract" Apr 17 17:34:56.955719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.955581 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerName="util" Apr 17 17:34:56.955719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.955586 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerName="util" Apr 17 17:34:56.955719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.955648 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="074b93c3-4396-48d8-adfe-9ad1672009e1" containerName="extract" Apr 17 17:34:56.958427 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.958408 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:56.967421 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.967400 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:34:56.967510 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.967410 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:34:56.968367 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:56.968352 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5w67t\"" Apr 17 17:34:57.000177 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.000132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56"] Apr 17 17:34:57.005174 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.005144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2gp7\" (UniqueName: \"kubernetes.io/projected/06410a3f-baa5-4d4e-878b-6e3f4d7db364-kube-api-access-l2gp7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" 
(UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.005325 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.005190 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.005325 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.005286 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.106299 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.106254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2gp7\" (UniqueName: \"kubernetes.io/projected/06410a3f-baa5-4d4e-878b-6e3f4d7db364-kube-api-access-l2gp7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.106515 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.106320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" (UID: 
\"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.106515 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.106358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.106792 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.106771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.106850 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.106806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.116448 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.116413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2gp7\" (UniqueName: \"kubernetes.io/projected/06410a3f-baa5-4d4e-878b-6e3f4d7db364-kube-api-access-l2gp7\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.267354 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.267324 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:34:57.417261 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.417233 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56"] Apr 17 17:34:57.418954 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:34:57.418916 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06410a3f_baa5_4d4e_878b_6e3f4d7db364.slice/crio-727aded83a59f496424f0f4bb1b85a36cf61f81b23273bad6d18979fd0dedabe WatchSource:0}: Error finding container 727aded83a59f496424f0f4bb1b85a36cf61f81b23273bad6d18979fd0dedabe: Status 404 returned error can't find the container with id 727aded83a59f496424f0f4bb1b85a36cf61f81b23273bad6d18979fd0dedabe Apr 17 17:34:57.874520 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.874431 2576 generic.go:358] "Generic (PLEG): container finished" podID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerID="e3030f2b3c2d8d80e2888c167139497e4427ef83d99e4c95e3e0df4354d58c32" exitCode=0 Apr 17 17:34:57.874667 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.874521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" event={"ID":"06410a3f-baa5-4d4e-878b-6e3f4d7db364","Type":"ContainerDied","Data":"e3030f2b3c2d8d80e2888c167139497e4427ef83d99e4c95e3e0df4354d58c32"} Apr 17 17:34:57.874667 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:57.874560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" event={"ID":"06410a3f-baa5-4d4e-878b-6e3f4d7db364","Type":"ContainerStarted","Data":"727aded83a59f496424f0f4bb1b85a36cf61f81b23273bad6d18979fd0dedabe"} Apr 17 17:34:58.880305 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:58.880264 2576 generic.go:358] "Generic (PLEG): container finished" podID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerID="229652eaac131cfed63f4f8b6ab1fb0726fe22a0f2c3a0745e69353f16db1194" exitCode=0 Apr 17 17:34:58.880659 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:58.880312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" event={"ID":"06410a3f-baa5-4d4e-878b-6e3f4d7db364","Type":"ContainerDied","Data":"229652eaac131cfed63f4f8b6ab1fb0726fe22a0f2c3a0745e69353f16db1194"} Apr 17 17:34:59.885109 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:59.885071 2576 generic.go:358] "Generic (PLEG): container finished" podID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerID="c93ac8790986e46bec8462b2bde727ab6b844f09b63c203905a03084f16047b7" exitCode=0 Apr 17 17:34:59.885503 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:34:59.885158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" event={"ID":"06410a3f-baa5-4d4e-878b-6e3f4d7db364","Type":"ContainerDied","Data":"c93ac8790986e46bec8462b2bde727ab6b844f09b63c203905a03084f16047b7"} Apr 17 17:35:01.010209 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.010182 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:35:01.038156 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.038122 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-util\") pod \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " Apr 17 17:35:01.038303 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.038193 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-bundle\") pod \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " Apr 17 17:35:01.038303 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.038232 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2gp7\" (UniqueName: \"kubernetes.io/projected/06410a3f-baa5-4d4e-878b-6e3f4d7db364-kube-api-access-l2gp7\") pod \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\" (UID: \"06410a3f-baa5-4d4e-878b-6e3f4d7db364\") " Apr 17 17:35:01.039491 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.039460 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-bundle" (OuterVolumeSpecName: "bundle") pod "06410a3f-baa5-4d4e-878b-6e3f4d7db364" (UID: "06410a3f-baa5-4d4e-878b-6e3f4d7db364"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:35:01.040475 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.040449 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06410a3f-baa5-4d4e-878b-6e3f4d7db364-kube-api-access-l2gp7" (OuterVolumeSpecName: "kube-api-access-l2gp7") pod "06410a3f-baa5-4d4e-878b-6e3f4d7db364" (UID: "06410a3f-baa5-4d4e-878b-6e3f4d7db364"). InnerVolumeSpecName "kube-api-access-l2gp7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:35:01.043835 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.043809 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-util" (OuterVolumeSpecName: "util") pod "06410a3f-baa5-4d4e-878b-6e3f4d7db364" (UID: "06410a3f-baa5-4d4e-878b-6e3f4d7db364"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:35:01.139595 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.139510 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:35:01.139595 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.139541 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/06410a3f-baa5-4d4e-878b-6e3f4d7db364-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:35:01.139595 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.139553 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2gp7\" (UniqueName: \"kubernetes.io/projected/06410a3f-baa5-4d4e-878b-6e3f4d7db364-kube-api-access-l2gp7\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:35:01.894856 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.894825 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" Apr 17 17:35:01.894856 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.894837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2n7n56" event={"ID":"06410a3f-baa5-4d4e-878b-6e3f4d7db364","Type":"ContainerDied","Data":"727aded83a59f496424f0f4bb1b85a36cf61f81b23273bad6d18979fd0dedabe"} Apr 17 17:35:01.895095 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:01.894869 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="727aded83a59f496424f0f4bb1b85a36cf61f81b23273bad6d18979fd0dedabe" Apr 17 17:35:55.881462 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.881396 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5554d84dd4-42zlp"] Apr 17 17:35:55.883405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.882234 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerName="extract" Apr 17 17:35:55.883405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.882258 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerName="extract" Apr 17 17:35:55.883405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.882298 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerName="util" Apr 17 17:35:55.883405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.882313 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerName="util" Apr 17 17:35:55.883405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.882334 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerName="pull" Apr 17 17:35:55.883405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.882344 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerName="pull" Apr 17 17:35:55.883405 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.882548 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="06410a3f-baa5-4d4e-878b-6e3f4d7db364" containerName="extract" Apr 17 17:35:55.887557 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.887530 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:55.893128 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:55.893101 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5554d84dd4-42zlp"] Apr 17 17:35:56.014947 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.014908 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmxnf\" (UniqueName: \"kubernetes.io/projected/65400f04-d335-4c62-be8b-b4561c51315e-kube-api-access-qmxnf\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.014947 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.014944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65400f04-d335-4c62-be8b-b4561c51315e-console-serving-cert\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.015246 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.014968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-console-config\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.015246 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.015088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-service-ca\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.015246 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.015185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-oauth-serving-cert\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.015371 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.015259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65400f04-d335-4c62-be8b-b4561c51315e-console-oauth-config\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.015371 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.015283 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-trusted-ca-bundle\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.115782 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 17:35:56.115742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-oauth-serving-cert\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.115981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.115809 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65400f04-d335-4c62-be8b-b4561c51315e-console-oauth-config\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.115981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.115831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-trusted-ca-bundle\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.115981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.115875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmxnf\" (UniqueName: \"kubernetes.io/projected/65400f04-d335-4c62-be8b-b4561c51315e-kube-api-access-qmxnf\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.115981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.115927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65400f04-d335-4c62-be8b-b4561c51315e-console-serving-cert\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " 
pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.115981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.115961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-console-config\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.116301 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.116198 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-service-ca\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.116681 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.116654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-console-config\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.116809 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.116756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-trusted-ca-bundle\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.116809 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.116782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-service-ca\") pod \"console-5554d84dd4-42zlp\" (UID: 
\"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.116954 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.116930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65400f04-d335-4c62-be8b-b4561c51315e-oauth-serving-cert\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.118473 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.118454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65400f04-d335-4c62-be8b-b4561c51315e-console-oauth-config\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.118569 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.118551 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65400f04-d335-4c62-be8b-b4561c51315e-console-serving-cert\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.126683 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.126655 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmxnf\" (UniqueName: \"kubernetes.io/projected/65400f04-d335-4c62-be8b-b4561c51315e-kube-api-access-qmxnf\") pod \"console-5554d84dd4-42zlp\" (UID: \"65400f04-d335-4c62-be8b-b4561c51315e\") " pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.198306 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.198213 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:35:56.333317 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:56.333287 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5554d84dd4-42zlp"] Apr 17 17:35:56.334928 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:35:56.334896 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65400f04_d335_4c62_be8b_b4561c51315e.slice/crio-2076872f40b4e5d3fb08b21f413733821e060042883272bbb2c391ba9b77e0d2 WatchSource:0}: Error finding container 2076872f40b4e5d3fb08b21f413733821e060042883272bbb2c391ba9b77e0d2: Status 404 returned error can't find the container with id 2076872f40b4e5d3fb08b21f413733821e060042883272bbb2c391ba9b77e0d2 Apr 17 17:35:57.110944 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:57.110905 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5554d84dd4-42zlp" event={"ID":"65400f04-d335-4c62-be8b-b4561c51315e","Type":"ContainerStarted","Data":"10df1bf1e79b1f292d5e0884aad5be0a7febf0baf5cf9d05eda32cd3d69f6a46"} Apr 17 17:35:57.110944 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:57.110941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5554d84dd4-42zlp" event={"ID":"65400f04-d335-4c62-be8b-b4561c51315e","Type":"ContainerStarted","Data":"2076872f40b4e5d3fb08b21f413733821e060042883272bbb2c391ba9b77e0d2"} Apr 17 17:35:57.131222 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:35:57.131171 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5554d84dd4-42zlp" podStartSLOduration=2.131153939 podStartE2EDuration="2.131153939s" podCreationTimestamp="2026-04-17 17:35:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:35:57.128341978 +0000 UTC 
m=+583.703690270" watchObservedRunningTime="2026-04-17 17:35:57.131153939 +0000 UTC m=+583.706502227" Apr 17 17:36:01.364364 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.364325 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x"] Apr 17 17:36:01.366939 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.366920 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.370159 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.370123 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:36:01.370159 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.370125 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:36:01.370326 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.370126 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-l59xh\"" Apr 17 17:36:01.376831 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.376801 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x"] Apr 17 17:36:01.461679 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.461641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.461856 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.461700 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6qt\" (UniqueName: \"kubernetes.io/projected/b98faa71-de81-49c6-a9e7-add858e6a506-kube-api-access-9n6qt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.461856 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.461730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.563010 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.562964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6qt\" (UniqueName: \"kubernetes.io/projected/b98faa71-de81-49c6-a9e7-add858e6a506-kube-api-access-9n6qt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.563010 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.563012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.563248 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.563149 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.563555 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.563532 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.563591 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.563546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.572235 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.572201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6qt\" (UniqueName: \"kubernetes.io/projected/b98faa71-de81-49c6-a9e7-add858e6a506-kube-api-access-9n6qt\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.677248 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.677158 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:01.807265 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.807230 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x"] Apr 17 17:36:01.809360 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:36:01.809331 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb98faa71_de81_49c6_a9e7_add858e6a506.slice/crio-75409113601cd4186080d94c4f83d52858e8a9cc4a6785eb3967a17e1676496b WatchSource:0}: Error finding container 75409113601cd4186080d94c4f83d52858e8a9cc4a6785eb3967a17e1676496b: Status 404 returned error can't find the container with id 75409113601cd4186080d94c4f83d52858e8a9cc4a6785eb3967a17e1676496b Apr 17 17:36:01.965187 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.965085 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr"] Apr 17 17:36:01.967548 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.967530 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:01.978245 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:01.978214 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr"] Apr 17 17:36:02.067315 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.067279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.067315 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.067321 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.067542 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.067363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvdcc\" (UniqueName: \"kubernetes.io/projected/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-kube-api-access-gvdcc\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.132495 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.132461 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="b98faa71-de81-49c6-a9e7-add858e6a506" containerID="8b954e47661bce9a4d69d5b0381c62b5b26ba989dea22708e452d4f9ba17b9ec" exitCode=0 Apr 17 17:36:02.132655 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.132506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" event={"ID":"b98faa71-de81-49c6-a9e7-add858e6a506","Type":"ContainerDied","Data":"8b954e47661bce9a4d69d5b0381c62b5b26ba989dea22708e452d4f9ba17b9ec"} Apr 17 17:36:02.132655 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.132540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" event={"ID":"b98faa71-de81-49c6-a9e7-add858e6a506","Type":"ContainerStarted","Data":"75409113601cd4186080d94c4f83d52858e8a9cc4a6785eb3967a17e1676496b"} Apr 17 17:36:02.167990 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.167949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.168193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.168005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvdcc\" (UniqueName: \"kubernetes.io/projected/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-kube-api-access-gvdcc\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.168193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.168112 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.168383 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.168363 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.168473 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.168451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.177173 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.177145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvdcc\" (UniqueName: \"kubernetes.io/projected/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-kube-api-access-gvdcc\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.277624 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.277589 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:02.409747 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.409718 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr"] Apr 17 17:36:02.411417 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:36:02.411386 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b7832f4_4a06_4efa_8bdb_8a8986f6002d.slice/crio-30fba8c84f4980106b07e84f74f2a9377a79c3b4c3770ec2fbec44b7a9af851e WatchSource:0}: Error finding container 30fba8c84f4980106b07e84f74f2a9377a79c3b4c3770ec2fbec44b7a9af851e: Status 404 returned error can't find the container with id 30fba8c84f4980106b07e84f74f2a9377a79c3b4c3770ec2fbec44b7a9af851e Apr 17 17:36:02.568987 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.568907 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj"] Apr 17 17:36:02.572337 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.572309 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.580301 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.580264 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj"] Apr 17 17:36:02.672755 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.672719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxrcs\" (UniqueName: \"kubernetes.io/projected/febceaca-a96e-48ab-89af-43744615033b-kube-api-access-bxrcs\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.672935 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.672855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.672935 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.672889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.773511 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.773478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.773698 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.773524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.773698 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.773592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxrcs\" (UniqueName: \"kubernetes.io/projected/febceaca-a96e-48ab-89af-43744615033b-kube-api-access-bxrcs\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.773851 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.773828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.773911 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.773872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.782412 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.782380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxrcs\" (UniqueName: \"kubernetes.io/projected/febceaca-a96e-48ab-89af-43744615033b-kube-api-access-bxrcs\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.893226 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.893193 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:02.963487 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.963450 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g"] Apr 17 17:36:02.967424 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.967401 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:02.975065 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:02.975000 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g"] Apr 17 17:36:03.033008 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.032981 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj"] Apr 17 17:36:03.040083 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:36:03.040056 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfebceaca_a96e_48ab_89af_43744615033b.slice/crio-13961a32f62f1d7c0d269be2a67663fbb2d5e74fdd3d45e06d819a3579081230 WatchSource:0}: Error finding container 13961a32f62f1d7c0d269be2a67663fbb2d5e74fdd3d45e06d819a3579081230: Status 404 returned error can't find the container with id 13961a32f62f1d7c0d269be2a67663fbb2d5e74fdd3d45e06d819a3579081230 Apr 17 17:36:03.076187 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.076153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.076340 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.076195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") 
" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.076340 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.076227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgrj\" (UniqueName: \"kubernetes.io/projected/7be7913a-73b8-4554-a4a7-945a2dab520d-kube-api-access-tmgrj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.137546 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.137456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" event={"ID":"febceaca-a96e-48ab-89af-43744615033b","Type":"ContainerStarted","Data":"5fa7d0e9deb32a483c999b157c32789f44f7fc6d537d3ed879606b1858bccb3d"} Apr 17 17:36:03.137546 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.137501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" event={"ID":"febceaca-a96e-48ab-89af-43744615033b","Type":"ContainerStarted","Data":"13961a32f62f1d7c0d269be2a67663fbb2d5e74fdd3d45e06d819a3579081230"} Apr 17 17:36:03.138946 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.138919 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerID="9052c24368b8173f660a5a15194b40b16b94f454e22ad76908ec33f08dc635de" exitCode=0 Apr 17 17:36:03.139075 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.139017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" 
event={"ID":"1b7832f4-4a06-4efa-8bdb-8a8986f6002d","Type":"ContainerDied","Data":"9052c24368b8173f660a5a15194b40b16b94f454e22ad76908ec33f08dc635de"} Apr 17 17:36:03.139138 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.139075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" event={"ID":"1b7832f4-4a06-4efa-8bdb-8a8986f6002d","Type":"ContainerStarted","Data":"30fba8c84f4980106b07e84f74f2a9377a79c3b4c3770ec2fbec44b7a9af851e"} Apr 17 17:36:03.140992 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.140968 2576 generic.go:358] "Generic (PLEG): container finished" podID="b98faa71-de81-49c6-a9e7-add858e6a506" containerID="41ead866ccfc3099de9b4c7808e50d14a12f19aaab728008da097312cf4fcccc" exitCode=0 Apr 17 17:36:03.141118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.141060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" event={"ID":"b98faa71-de81-49c6-a9e7-add858e6a506","Type":"ContainerDied","Data":"41ead866ccfc3099de9b4c7808e50d14a12f19aaab728008da097312cf4fcccc"} Apr 17 17:36:03.179787 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.179550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgrj\" (UniqueName: \"kubernetes.io/projected/7be7913a-73b8-4554-a4a7-945a2dab520d-kube-api-access-tmgrj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.179995 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.179781 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-bundle\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.179995 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.179830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.180327 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.180305 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.180416 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.180397 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.188244 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.188221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgrj\" (UniqueName: \"kubernetes.io/projected/7be7913a-73b8-4554-a4a7-945a2dab520d-kube-api-access-tmgrj\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.347181 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.347140 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:03.476917 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:03.476890 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g"] Apr 17 17:36:03.478973 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:36:03.478939 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7be7913a_73b8_4554_a4a7_945a2dab520d.slice/crio-cecc12546e4238391b485ebd5283ece52dea2b3f3aaf5ff49d20b74a2fe52d4a WatchSource:0}: Error finding container cecc12546e4238391b485ebd5283ece52dea2b3f3aaf5ff49d20b74a2fe52d4a: Status 404 returned error can't find the container with id cecc12546e4238391b485ebd5283ece52dea2b3f3aaf5ff49d20b74a2fe52d4a Apr 17 17:36:04.146632 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.146595 2576 generic.go:358] "Generic (PLEG): container finished" podID="b98faa71-de81-49c6-a9e7-add858e6a506" containerID="452957b276e50e95d185c378f45030e6e7069fb1596dda8ea8ed5465bc8ff2bb" exitCode=0 Apr 17 17:36:04.146834 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.146682 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" event={"ID":"b98faa71-de81-49c6-a9e7-add858e6a506","Type":"ContainerDied","Data":"452957b276e50e95d185c378f45030e6e7069fb1596dda8ea8ed5465bc8ff2bb"} Apr 17 17:36:04.147995 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.147972 2576 generic.go:358] "Generic (PLEG): container finished" podID="7be7913a-73b8-4554-a4a7-945a2dab520d" 
containerID="9820c4e7ec7465ff4d734859800e28c2896cb51d0b8ef1800d73a368e8df0ca4" exitCode=0 Apr 17 17:36:04.148129 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.148052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" event={"ID":"7be7913a-73b8-4554-a4a7-945a2dab520d","Type":"ContainerDied","Data":"9820c4e7ec7465ff4d734859800e28c2896cb51d0b8ef1800d73a368e8df0ca4"} Apr 17 17:36:04.148129 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.148080 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" event={"ID":"7be7913a-73b8-4554-a4a7-945a2dab520d","Type":"ContainerStarted","Data":"cecc12546e4238391b485ebd5283ece52dea2b3f3aaf5ff49d20b74a2fe52d4a"} Apr 17 17:36:04.149485 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.149464 2576 generic.go:358] "Generic (PLEG): container finished" podID="febceaca-a96e-48ab-89af-43744615033b" containerID="5fa7d0e9deb32a483c999b157c32789f44f7fc6d537d3ed879606b1858bccb3d" exitCode=0 Apr 17 17:36:04.149598 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.149521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" event={"ID":"febceaca-a96e-48ab-89af-43744615033b","Type":"ContainerDied","Data":"5fa7d0e9deb32a483c999b157c32789f44f7fc6d537d3ed879606b1858bccb3d"} Apr 17 17:36:04.151478 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.151455 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerID="c356db3c5835fbf640b19514008d7db9f26b686f74e6370b18bc6b0b2eb69050" exitCode=0 Apr 17 17:36:04.151571 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:04.151527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" 
event={"ID":"1b7832f4-4a06-4efa-8bdb-8a8986f6002d","Type":"ContainerDied","Data":"c356db3c5835fbf640b19514008d7db9f26b686f74e6370b18bc6b0b2eb69050"} Apr 17 17:36:05.157157 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.157123 2576 generic.go:358] "Generic (PLEG): container finished" podID="febceaca-a96e-48ab-89af-43744615033b" containerID="f3b9f661238ec724c915cdb01115fc31273397d979779d8bba789a2342484e53" exitCode=0 Apr 17 17:36:05.157572 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.157200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" event={"ID":"febceaca-a96e-48ab-89af-43744615033b","Type":"ContainerDied","Data":"f3b9f661238ec724c915cdb01115fc31273397d979779d8bba789a2342484e53"} Apr 17 17:36:05.159654 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.159623 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerID="5051fb26f7e161501bcfc7d90db493558ce5271a8d592a521afa3a5cd4c967fb" exitCode=0 Apr 17 17:36:05.159794 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.159716 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" event={"ID":"1b7832f4-4a06-4efa-8bdb-8a8986f6002d","Type":"ContainerDied","Data":"5051fb26f7e161501bcfc7d90db493558ce5271a8d592a521afa3a5cd4c967fb"} Apr 17 17:36:05.295260 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.295227 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:05.399777 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.399739 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6qt\" (UniqueName: \"kubernetes.io/projected/b98faa71-de81-49c6-a9e7-add858e6a506-kube-api-access-9n6qt\") pod \"b98faa71-de81-49c6-a9e7-add858e6a506\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " Apr 17 17:36:05.399937 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.399827 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-bundle\") pod \"b98faa71-de81-49c6-a9e7-add858e6a506\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " Apr 17 17:36:05.399937 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.399927 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-util\") pod \"b98faa71-de81-49c6-a9e7-add858e6a506\" (UID: \"b98faa71-de81-49c6-a9e7-add858e6a506\") " Apr 17 17:36:05.400380 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.400351 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-bundle" (OuterVolumeSpecName: "bundle") pod "b98faa71-de81-49c6-a9e7-add858e6a506" (UID: "b98faa71-de81-49c6-a9e7-add858e6a506"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:05.402017 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.401995 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98faa71-de81-49c6-a9e7-add858e6a506-kube-api-access-9n6qt" (OuterVolumeSpecName: "kube-api-access-9n6qt") pod "b98faa71-de81-49c6-a9e7-add858e6a506" (UID: "b98faa71-de81-49c6-a9e7-add858e6a506"). InnerVolumeSpecName "kube-api-access-9n6qt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:05.405766 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.405725 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-util" (OuterVolumeSpecName: "util") pod "b98faa71-de81-49c6-a9e7-add858e6a506" (UID: "b98faa71-de81-49c6-a9e7-add858e6a506"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:05.500655 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.500572 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:05.500655 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.500601 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9n6qt\" (UniqueName: \"kubernetes.io/projected/b98faa71-de81-49c6-a9e7-add858e6a506-kube-api-access-9n6qt\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:05.500655 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:05.500610 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b98faa71-de81-49c6-a9e7-add858e6a506-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:06.164801 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.164757 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerID="c00e0796047a52a6319aa8dcc471d0a9051ecd57dd4536d0951f75a7f2856864" exitCode=0 Apr 17 17:36:06.165280 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.164845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" event={"ID":"7be7913a-73b8-4554-a4a7-945a2dab520d","Type":"ContainerDied","Data":"c00e0796047a52a6319aa8dcc471d0a9051ecd57dd4536d0951f75a7f2856864"} Apr 17 17:36:06.167129 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.167102 2576 generic.go:358] "Generic (PLEG): container finished" podID="febceaca-a96e-48ab-89af-43744615033b" containerID="24ba233af02fa890f8f263bf993dc85dbe0be6a60a5cd9fdbd5065c274c51712" exitCode=0 Apr 17 17:36:06.167235 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.167168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" event={"ID":"febceaca-a96e-48ab-89af-43744615033b","Type":"ContainerDied","Data":"24ba233af02fa890f8f263bf993dc85dbe0be6a60a5cd9fdbd5065c274c51712"} Apr 17 17:36:06.168862 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.168839 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" event={"ID":"b98faa71-de81-49c6-a9e7-add858e6a506","Type":"ContainerDied","Data":"75409113601cd4186080d94c4f83d52858e8a9cc4a6785eb3967a17e1676496b"} Apr 17 17:36:06.168944 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.168866 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75409113601cd4186080d94c4f83d52858e8a9cc4a6785eb3967a17e1676496b" Apr 17 17:36:06.169128 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.169107 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x" Apr 17 17:36:06.199148 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.199112 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:36:06.199148 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.199154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:36:06.204810 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.204782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:36:06.303287 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.303263 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:06.409516 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.409480 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvdcc\" (UniqueName: \"kubernetes.io/projected/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-kube-api-access-gvdcc\") pod \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " Apr 17 17:36:06.409707 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.409578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-util\") pod \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " Apr 17 17:36:06.409707 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.409642 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-bundle\") pod 
\"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\" (UID: \"1b7832f4-4a06-4efa-8bdb-8a8986f6002d\") " Apr 17 17:36:06.410118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.410091 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-bundle" (OuterVolumeSpecName: "bundle") pod "1b7832f4-4a06-4efa-8bdb-8a8986f6002d" (UID: "1b7832f4-4a06-4efa-8bdb-8a8986f6002d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:06.411693 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.411663 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-kube-api-access-gvdcc" (OuterVolumeSpecName: "kube-api-access-gvdcc") pod "1b7832f4-4a06-4efa-8bdb-8a8986f6002d" (UID: "1b7832f4-4a06-4efa-8bdb-8a8986f6002d"). InnerVolumeSpecName "kube-api-access-gvdcc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:06.415164 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.415142 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-util" (OuterVolumeSpecName: "util") pod "1b7832f4-4a06-4efa-8bdb-8a8986f6002d" (UID: "1b7832f4-4a06-4efa-8bdb-8a8986f6002d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:06.510286 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.510237 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gvdcc\" (UniqueName: \"kubernetes.io/projected/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-kube-api-access-gvdcc\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:06.510286 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.510281 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:06.510286 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:06.510295 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1b7832f4-4a06-4efa-8bdb-8a8986f6002d-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:07.174156 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.174118 2576 generic.go:358] "Generic (PLEG): container finished" podID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerID="5e5bd336dca8cc739711e17197c508096745d1e9a3e8dea940e252b797fc4972" exitCode=0 Apr 17 17:36:07.174588 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.174193 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" event={"ID":"7be7913a-73b8-4554-a4a7-945a2dab520d","Type":"ContainerDied","Data":"5e5bd336dca8cc739711e17197c508096745d1e9a3e8dea940e252b797fc4972"} Apr 17 17:36:07.175843 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.175817 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" Apr 17 17:36:07.175843 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.175830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr" event={"ID":"1b7832f4-4a06-4efa-8bdb-8a8986f6002d","Type":"ContainerDied","Data":"30fba8c84f4980106b07e84f74f2a9377a79c3b4c3770ec2fbec44b7a9af851e"} Apr 17 17:36:07.175998 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.175859 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30fba8c84f4980106b07e84f74f2a9377a79c3b4c3770ec2fbec44b7a9af851e" Apr 17 17:36:07.180348 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.180321 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5554d84dd4-42zlp" Apr 17 17:36:07.247381 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.247349 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67999b787-qb6tf"] Apr 17 17:36:07.328320 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.328295 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:07.418974 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.418938 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxrcs\" (UniqueName: \"kubernetes.io/projected/febceaca-a96e-48ab-89af-43744615033b-kube-api-access-bxrcs\") pod \"febceaca-a96e-48ab-89af-43744615033b\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " Apr 17 17:36:07.419153 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.418993 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-util\") pod \"febceaca-a96e-48ab-89af-43744615033b\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " Apr 17 17:36:07.419223 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.419149 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-bundle\") pod \"febceaca-a96e-48ab-89af-43744615033b\" (UID: \"febceaca-a96e-48ab-89af-43744615033b\") " Apr 17 17:36:07.419753 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.419721 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-bundle" (OuterVolumeSpecName: "bundle") pod "febceaca-a96e-48ab-89af-43744615033b" (UID: "febceaca-a96e-48ab-89af-43744615033b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:07.421174 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.421144 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/febceaca-a96e-48ab-89af-43744615033b-kube-api-access-bxrcs" (OuterVolumeSpecName: "kube-api-access-bxrcs") pod "febceaca-a96e-48ab-89af-43744615033b" (UID: "febceaca-a96e-48ab-89af-43744615033b"). InnerVolumeSpecName "kube-api-access-bxrcs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:07.426170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.426092 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-util" (OuterVolumeSpecName: "util") pod "febceaca-a96e-48ab-89af-43744615033b" (UID: "febceaca-a96e-48ab-89af-43744615033b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:07.520347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.520306 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:07.520347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.520337 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bxrcs\" (UniqueName: \"kubernetes.io/projected/febceaca-a96e-48ab-89af-43744615033b-kube-api-access-bxrcs\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:07.520347 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:07.520349 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/febceaca-a96e-48ab-89af-43744615033b-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:08.181770 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.181740 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" Apr 17 17:36:08.182243 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.181762 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj" event={"ID":"febceaca-a96e-48ab-89af-43744615033b","Type":"ContainerDied","Data":"13961a32f62f1d7c0d269be2a67663fbb2d5e74fdd3d45e06d819a3579081230"} Apr 17 17:36:08.182243 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.181803 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13961a32f62f1d7c0d269be2a67663fbb2d5e74fdd3d45e06d819a3579081230" Apr 17 17:36:08.311716 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.311691 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:08.428367 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.428330 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmgrj\" (UniqueName: \"kubernetes.io/projected/7be7913a-73b8-4554-a4a7-945a2dab520d-kube-api-access-tmgrj\") pod \"7be7913a-73b8-4554-a4a7-945a2dab520d\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " Apr 17 17:36:08.428521 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.428386 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-util\") pod \"7be7913a-73b8-4554-a4a7-945a2dab520d\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " Apr 17 17:36:08.428521 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.428419 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-bundle\") pod 
\"7be7913a-73b8-4554-a4a7-945a2dab520d\" (UID: \"7be7913a-73b8-4554-a4a7-945a2dab520d\") " Apr 17 17:36:08.428991 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.428955 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-bundle" (OuterVolumeSpecName: "bundle") pod "7be7913a-73b8-4554-a4a7-945a2dab520d" (UID: "7be7913a-73b8-4554-a4a7-945a2dab520d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:08.430445 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.430423 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7be7913a-73b8-4554-a4a7-945a2dab520d-kube-api-access-tmgrj" (OuterVolumeSpecName: "kube-api-access-tmgrj") pod "7be7913a-73b8-4554-a4a7-945a2dab520d" (UID: "7be7913a-73b8-4554-a4a7-945a2dab520d"). InnerVolumeSpecName "kube-api-access-tmgrj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:08.433942 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.433920 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-util" (OuterVolumeSpecName: "util") pod "7be7913a-73b8-4554-a4a7-945a2dab520d" (UID: "7be7913a-73b8-4554-a4a7-945a2dab520d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:08.529803 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.529761 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:08.529803 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.529799 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tmgrj\" (UniqueName: \"kubernetes.io/projected/7be7913a-73b8-4554-a4a7-945a2dab520d-kube-api-access-tmgrj\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:08.529803 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:08.529809 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7be7913a-73b8-4554-a4a7-945a2dab520d-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:09.187839 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:09.187815 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" Apr 17 17:36:09.188271 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:09.187814 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g" event={"ID":"7be7913a-73b8-4554-a4a7-945a2dab520d","Type":"ContainerDied","Data":"cecc12546e4238391b485ebd5283ece52dea2b3f3aaf5ff49d20b74a2fe52d4a"} Apr 17 17:36:09.188271 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:09.187916 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cecc12546e4238391b485ebd5283ece52dea2b3f3aaf5ff49d20b74a2fe52d4a" Apr 17 17:36:13.936114 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:13.936089 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 17:36:13.936573 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:13.936089 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 17:36:13.943544 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:13.943518 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 17:36:13.943703 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:13.943522 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 17:36:32.271919 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.271804 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67999b787-qb6tf" 
podUID="1de51bba-e145-42a6-842c-a3db6e008dd4" containerName="console" containerID="cri-o://fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792" gracePeriod=15 Apr 17 17:36:32.513413 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.513390 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67999b787-qb6tf_1de51bba-e145-42a6-842c-a3db6e008dd4/console/0.log" Apr 17 17:36:32.513550 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.513451 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:36:32.538623 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.538539 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-service-ca\") pod \"1de51bba-e145-42a6-842c-a3db6e008dd4\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " Apr 17 17:36:32.538623 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.538585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-trusted-ca-bundle\") pod \"1de51bba-e145-42a6-842c-a3db6e008dd4\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " Apr 17 17:36:32.538623 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.538605 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-oauth-config\") pod \"1de51bba-e145-42a6-842c-a3db6e008dd4\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " Apr 17 17:36:32.538892 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.538629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-console-config\") pod \"1de51bba-e145-42a6-842c-a3db6e008dd4\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " Apr 17 17:36:32.538892 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.538647 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlrwn\" (UniqueName: \"kubernetes.io/projected/1de51bba-e145-42a6-842c-a3db6e008dd4-kube-api-access-jlrwn\") pod \"1de51bba-e145-42a6-842c-a3db6e008dd4\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " Apr 17 17:36:32.538892 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.538666 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-serving-cert\") pod \"1de51bba-e145-42a6-842c-a3db6e008dd4\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " Apr 17 17:36:32.538892 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.538700 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-oauth-serving-cert\") pod \"1de51bba-e145-42a6-842c-a3db6e008dd4\" (UID: \"1de51bba-e145-42a6-842c-a3db6e008dd4\") " Apr 17 17:36:32.539127 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.538964 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-console-config" (OuterVolumeSpecName: "console-config") pod "1de51bba-e145-42a6-842c-a3db6e008dd4" (UID: "1de51bba-e145-42a6-842c-a3db6e008dd4"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:36:32.539218 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.539012 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1de51bba-e145-42a6-842c-a3db6e008dd4" (UID: "1de51bba-e145-42a6-842c-a3db6e008dd4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:36:32.539283 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.539225 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1de51bba-e145-42a6-842c-a3db6e008dd4" (UID: "1de51bba-e145-42a6-842c-a3db6e008dd4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:36:32.539283 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.539238 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-service-ca" (OuterVolumeSpecName: "service-ca") pod "1de51bba-e145-42a6-842c-a3db6e008dd4" (UID: "1de51bba-e145-42a6-842c-a3db6e008dd4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:36:32.540909 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.540882 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1de51bba-e145-42a6-842c-a3db6e008dd4" (UID: "1de51bba-e145-42a6-842c-a3db6e008dd4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:36:32.541066 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.540976 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1de51bba-e145-42a6-842c-a3db6e008dd4" (UID: "1de51bba-e145-42a6-842c-a3db6e008dd4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:36:32.541311 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.541295 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de51bba-e145-42a6-842c-a3db6e008dd4-kube-api-access-jlrwn" (OuterVolumeSpecName: "kube-api-access-jlrwn") pod "1de51bba-e145-42a6-842c-a3db6e008dd4" (UID: "1de51bba-e145-42a6-842c-a3db6e008dd4"). InnerVolumeSpecName "kube-api-access-jlrwn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:32.639328 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.639293 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-console-config\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:32.639328 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.639321 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jlrwn\" (UniqueName: \"kubernetes.io/projected/1de51bba-e145-42a6-842c-a3db6e008dd4-kube-api-access-jlrwn\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:32.639328 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.639332 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-serving-cert\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:32.639328 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.639341 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-oauth-serving-cert\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:32.639599 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.639349 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-service-ca\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:32.639599 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.639358 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1de51bba-e145-42a6-842c-a3db6e008dd4-trusted-ca-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:32.639599 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:32.639366 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1de51bba-e145-42a6-842c-a3db6e008dd4-console-oauth-config\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:36:33.280177 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.280149 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67999b787-qb6tf_1de51bba-e145-42a6-842c-a3db6e008dd4/console/0.log" Apr 17 17:36:33.280609 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.280188 2576 generic.go:358] "Generic (PLEG): container finished" podID="1de51bba-e145-42a6-842c-a3db6e008dd4" containerID="fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792" exitCode=2 Apr 17 17:36:33.280609 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.280243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67999b787-qb6tf" 
event={"ID":"1de51bba-e145-42a6-842c-a3db6e008dd4","Type":"ContainerDied","Data":"fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792"} Apr 17 17:36:33.280609 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.280254 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67999b787-qb6tf" Apr 17 17:36:33.280609 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.280264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67999b787-qb6tf" event={"ID":"1de51bba-e145-42a6-842c-a3db6e008dd4","Type":"ContainerDied","Data":"be7e5264d7039ebc21a61ee75a50f82adc1fb1481697674ad2a4d0df3d892b81"} Apr 17 17:36:33.280609 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.280279 2576 scope.go:117] "RemoveContainer" containerID="fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792" Apr 17 17:36:33.289858 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.289809 2576 scope.go:117] "RemoveContainer" containerID="fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792" Apr 17 17:36:33.290214 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:36:33.290186 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792\": container with ID starting with fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792 not found: ID does not exist" containerID="fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792" Apr 17 17:36:33.290302 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.290227 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792"} err="failed to get container status \"fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792\": rpc error: code = NotFound desc = could not find container 
\"fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792\": container with ID starting with fd0cf8fa3423b5ca1d29244146364f1af3cc67140245f10d0fe41c0a80a0b792 not found: ID does not exist" Apr 17 17:36:33.309531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.309468 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67999b787-qb6tf"] Apr 17 17:36:33.312364 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:33.312332 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67999b787-qb6tf"] Apr 17 17:36:34.003715 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:34.002993 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1de51bba-e145-42a6-842c-a3db6e008dd4" path="/var/lib/kubelet/pods/1de51bba-e145-42a6-842c-a3db6e008dd4/volumes" Apr 17 17:36:37.414593 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414553 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2"] Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414872 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b98faa71-de81-49c6-a9e7-add858e6a506" containerName="pull" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414883 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98faa71-de81-49c6-a9e7-add858e6a506" containerName="pull" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414894 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="febceaca-a96e-48ab-89af-43744615033b" containerName="util" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414899 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="febceaca-a96e-48ab-89af-43744615033b" containerName="util" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414910 2576 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerName="util" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414915 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerName="util" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414922 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="febceaca-a96e-48ab-89af-43744615033b" containerName="pull" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414927 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="febceaca-a96e-48ab-89af-43744615033b" containerName="pull" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414933 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="febceaca-a96e-48ab-89af-43744615033b" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414938 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="febceaca-a96e-48ab-89af-43744615033b" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414944 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414950 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414956 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b98faa71-de81-49c6-a9e7-add858e6a506" containerName="util" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414961 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b98faa71-de81-49c6-a9e7-add858e6a506" containerName="util" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414967 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerName="util" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414972 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerName="util" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414981 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414985 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414994 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerName="pull" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.414998 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerName="pull" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415006 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b98faa71-de81-49c6-a9e7-add858e6a506" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415010 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98faa71-de81-49c6-a9e7-add858e6a506" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415035 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" 
containerName="pull" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415041 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerName="pull" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415051 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1de51bba-e145-42a6-842c-a3db6e008dd4" containerName="console" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415057 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de51bba-e145-42a6-842c-a3db6e008dd4" containerName="console" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415112 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="febceaca-a96e-48ab-89af-43744615033b" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415119 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b7832f4-4a06-4efa-8bdb-8a8986f6002d" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415128 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b98faa71-de81-49c6-a9e7-add858e6a506" containerName="extract" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415135 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1de51bba-e145-42a6-842c-a3db6e008dd4" containerName="console" Apr 17 17:36:37.415118 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.415142 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7be7913a-73b8-4554-a4a7-945a2dab520d" containerName="extract" Apr 17 17:36:37.419292 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.419267 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.422386 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.422200 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:36:37.422386 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.422225 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 17:36:37.422386 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.422242 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:36:37.422941 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.422759 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-l59xh\"" Apr 17 17:36:37.422941 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.422818 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 17:36:37.425207 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.425179 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2"] Apr 17 17:36:37.479059 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.479004 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.479241 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.479078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.479241 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.479179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8xpw\" (UniqueName: \"kubernetes.io/projected/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-kube-api-access-z8xpw\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.579766 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.579721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.579766 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.579762 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.579981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.579812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8xpw\" (UniqueName: \"kubernetes.io/projected/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-kube-api-access-z8xpw\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.580472 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.580452 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.582165 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.582144 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.588112 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.588081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8xpw\" (UniqueName: \"kubernetes.io/projected/30559ecc-aaea-40bd-a0ec-1b5a8131ed89-kube-api-access-z8xpw\") pod \"kuadrant-console-plugin-6cb54b5c86-gq5h2\" (UID: \"30559ecc-aaea-40bd-a0ec-1b5a8131ed89\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.731064 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.730943 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" Apr 17 17:36:37.861582 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:37.861552 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2"] Apr 17 17:36:37.863520 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:36:37.863482 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30559ecc_aaea_40bd_a0ec_1b5a8131ed89.slice/crio-241224727c98f35a75b50d203c792109a897ea6a1a16e9473c2b5315023b6640 WatchSource:0}: Error finding container 241224727c98f35a75b50d203c792109a897ea6a1a16e9473c2b5315023b6640: Status 404 returned error can't find the container with id 241224727c98f35a75b50d203c792109a897ea6a1a16e9473c2b5315023b6640 Apr 17 17:36:38.299507 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:36:38.299471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" event={"ID":"30559ecc-aaea-40bd-a0ec-1b5a8131ed89","Type":"ContainerStarted","Data":"241224727c98f35a75b50d203c792109a897ea6a1a16e9473c2b5315023b6640"} Apr 17 17:37:01.408149 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:01.408098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" event={"ID":"30559ecc-aaea-40bd-a0ec-1b5a8131ed89","Type":"ContainerStarted","Data":"3e5442e6f1208e06df7eefe372de199e4ac8d70a6e82fed7016cb3b6f82bce2e"} Apr 17 17:37:01.426776 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:01.426725 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-gq5h2" podStartSLOduration=1.383383836 podStartE2EDuration="24.426709953s" podCreationTimestamp="2026-04-17 17:36:37 +0000 UTC" firstStartedPulling="2026-04-17 17:36:37.86476035 +0000 UTC m=+624.440108616" 
lastFinishedPulling="2026-04-17 17:37:00.908086467 +0000 UTC m=+647.483434733" observedRunningTime="2026-04-17 17:37:01.423857968 +0000 UTC m=+647.999206256" watchObservedRunningTime="2026-04-17 17:37:01.426709953 +0000 UTC m=+648.002058245" Apr 17 17:37:27.297738 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.297701 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:37:27.382800 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.382762 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:37:27.382968 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.382841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:37:27.382968 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.382914 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:27.385894 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.385866 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 17:37:27.424704 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.424660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-458lx\" (UniqueName: \"kubernetes.io/projected/df99ae14-b508-4fd2-a73b-404f8ddf12c5-kube-api-access-458lx\") pod \"limitador-limitador-78c99df468-82tgf\" (UID: \"df99ae14-b508-4fd2-a73b-404f8ddf12c5\") " pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:27.424878 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.424777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: 
\"kubernetes.io/configmap/df99ae14-b508-4fd2-a73b-404f8ddf12c5-config-file\") pod \"limitador-limitador-78c99df468-82tgf\" (UID: \"df99ae14-b508-4fd2-a73b-404f8ddf12c5\") " pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:27.525808 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.525760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/df99ae14-b508-4fd2-a73b-404f8ddf12c5-config-file\") pod \"limitador-limitador-78c99df468-82tgf\" (UID: \"df99ae14-b508-4fd2-a73b-404f8ddf12c5\") " pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:27.525995 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.525851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-458lx\" (UniqueName: \"kubernetes.io/projected/df99ae14-b508-4fd2-a73b-404f8ddf12c5-kube-api-access-458lx\") pod \"limitador-limitador-78c99df468-82tgf\" (UID: \"df99ae14-b508-4fd2-a73b-404f8ddf12c5\") " pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:27.526515 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.526494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/df99ae14-b508-4fd2-a73b-404f8ddf12c5-config-file\") pod \"limitador-limitador-78c99df468-82tgf\" (UID: \"df99ae14-b508-4fd2-a73b-404f8ddf12c5\") " pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:27.535039 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.535003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-458lx\" (UniqueName: \"kubernetes.io/projected/df99ae14-b508-4fd2-a73b-404f8ddf12c5-kube-api-access-458lx\") pod \"limitador-limitador-78c99df468-82tgf\" (UID: \"df99ae14-b508-4fd2-a73b-404f8ddf12c5\") " pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:27.695651 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:37:27.695544 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:27.826368 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.826341 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:37:27.828434 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:37:27.828398 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf99ae14_b508_4fd2_a73b_404f8ddf12c5.slice/crio-12eb448884e247c8edfcc87a3761df2765b6fed52fcb02550cd5d6e82ac5d95c WatchSource:0}: Error finding container 12eb448884e247c8edfcc87a3761df2765b6fed52fcb02550cd5d6e82ac5d95c: Status 404 returned error can't find the container with id 12eb448884e247c8edfcc87a3761df2765b6fed52fcb02550cd5d6e82ac5d95c Apr 17 17:37:27.830167 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:27.830147 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:37:28.511335 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:28.511290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" event={"ID":"df99ae14-b508-4fd2-a73b-404f8ddf12c5","Type":"ContainerStarted","Data":"12eb448884e247c8edfcc87a3761df2765b6fed52fcb02550cd5d6e82ac5d95c"} Apr 17 17:37:30.521891 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:30.521850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" event={"ID":"df99ae14-b508-4fd2-a73b-404f8ddf12c5","Type":"ContainerStarted","Data":"5d24280169d8473632563290f843d82ede44a721a1c3ab1177e0710875d678df"} Apr 17 17:37:30.522315 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:30.521958 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:37:30.542236 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:30.542183 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" podStartSLOduration=0.986791723 podStartE2EDuration="3.542169257s" podCreationTimestamp="2026-04-17 17:37:27 +0000 UTC" firstStartedPulling="2026-04-17 17:37:27.83030677 +0000 UTC m=+674.405655036" lastFinishedPulling="2026-04-17 17:37:30.385684294 +0000 UTC m=+676.961032570" observedRunningTime="2026-04-17 17:37:30.53982177 +0000 UTC m=+677.115170061" watchObservedRunningTime="2026-04-17 17:37:30.542169257 +0000 UTC m=+677.117517544" Apr 17 17:37:41.528586 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:37:41.528554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-82tgf" Apr 17 17:38:01.492038 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.491938 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw"] Apr 17 17:38:01.495740 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.495720 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.498466 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.498441 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5w67t\"" Apr 17 17:38:01.498606 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.498441 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 17:38:01.499789 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.499771 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 17:38:01.506719 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.506693 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw"] Apr 17 17:38:01.622831 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.622788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.622994 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.622837 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.622994 
ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.622950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm9nw\" (UniqueName: \"kubernetes.io/projected/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-kube-api-access-nm9nw\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.724270 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.724229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nm9nw\" (UniqueName: \"kubernetes.io/projected/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-kube-api-access-nm9nw\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.724460 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.724309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.724460 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.724332 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.724680 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:38:01.724664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.724753 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.724730 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.733831 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.733795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm9nw\" (UniqueName: \"kubernetes.io/projected/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-kube-api-access-nm9nw\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.805649 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.805613 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:01.943257 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:01.943223 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw"] Apr 17 17:38:01.943785 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:38:01.943759 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda86c6ec0_510e_4b7f_a2bd_088aef1e26a7.slice/crio-9debc0910910bdffa19febb29d97284ddee0065be7229bbcfb3b048ef5ac70f1 WatchSource:0}: Error finding container 9debc0910910bdffa19febb29d97284ddee0065be7229bbcfb3b048ef5ac70f1: Status 404 returned error can't find the container with id 9debc0910910bdffa19febb29d97284ddee0065be7229bbcfb3b048ef5ac70f1 Apr 17 17:38:02.644400 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:02.644359 2576 generic.go:358] "Generic (PLEG): container finished" podID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerID="bab4ca2474059e4fcefa487a4f913eb01f96a17196abe9a8b97336baa3ec46f0" exitCode=0 Apr 17 17:38:02.644754 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:02.644417 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" event={"ID":"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7","Type":"ContainerDied","Data":"bab4ca2474059e4fcefa487a4f913eb01f96a17196abe9a8b97336baa3ec46f0"} Apr 17 17:38:02.644754 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:02.644446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" event={"ID":"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7","Type":"ContainerStarted","Data":"9debc0910910bdffa19febb29d97284ddee0065be7229bbcfb3b048ef5ac70f1"} Apr 17 17:38:03.650170 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 17:38:03.650080 2576 generic.go:358] "Generic (PLEG): container finished" podID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerID="c7ec20e4efc148f316a1723a565963d97cb8fa73a646829c0683e3ece442e548" exitCode=0 Apr 17 17:38:03.650170 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:03.650139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" event={"ID":"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7","Type":"ContainerDied","Data":"c7ec20e4efc148f316a1723a565963d97cb8fa73a646829c0683e3ece442e548"} Apr 17 17:38:04.656216 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:04.656174 2576 generic.go:358] "Generic (PLEG): container finished" podID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerID="19f31932ddd8fc7641123449954ab5fdcce970aff9275efa9f830671828bfba9" exitCode=0 Apr 17 17:38:04.656600 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:04.656223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" event={"ID":"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7","Type":"ContainerDied","Data":"19f31932ddd8fc7641123449954ab5fdcce970aff9275efa9f830671828bfba9"} Apr 17 17:38:05.781527 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:05.781504 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:05.957287 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:05.957201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nm9nw\" (UniqueName: \"kubernetes.io/projected/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-kube-api-access-nm9nw\") pod \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " Apr 17 17:38:05.957287 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:05.957273 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-bundle\") pod \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " Apr 17 17:38:05.957472 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:05.957305 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-util\") pod \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\" (UID: \"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7\") " Apr 17 17:38:05.957802 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:05.957772 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-bundle" (OuterVolumeSpecName: "bundle") pod "a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" (UID: "a86c6ec0-510e-4b7f-a2bd-088aef1e26a7"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:05.959305 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:05.959279 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-kube-api-access-nm9nw" (OuterVolumeSpecName: "kube-api-access-nm9nw") pod "a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" (UID: "a86c6ec0-510e-4b7f-a2bd-088aef1e26a7"). InnerVolumeSpecName "kube-api-access-nm9nw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:05.963389 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:05.963359 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-util" (OuterVolumeSpecName: "util") pod "a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" (UID: "a86c6ec0-510e-4b7f-a2bd-088aef1e26a7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:06.058222 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:06.058178 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nm9nw\" (UniqueName: \"kubernetes.io/projected/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-kube-api-access-nm9nw\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:38:06.058222 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:06.058215 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-bundle\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:38:06.058222 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:06.058228 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a86c6ec0-510e-4b7f-a2bd-088aef1e26a7-util\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:38:06.665793 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:06.665761 2576 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" Apr 17 17:38:06.665981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:06.665755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350np2bw" event={"ID":"a86c6ec0-510e-4b7f-a2bd-088aef1e26a7","Type":"ContainerDied","Data":"9debc0910910bdffa19febb29d97284ddee0065be7229bbcfb3b048ef5ac70f1"} Apr 17 17:38:06.665981 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:06.665871 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9debc0910910bdffa19febb29d97284ddee0065be7229bbcfb3b048ef5ac70f1" Apr 17 17:38:24.250448 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.250408 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 17 17:38:24.251092 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.250935 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerName="pull" Apr 17 17:38:24.251092 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.250957 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerName="pull" Apr 17 17:38:24.251092 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.250972 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerName="util" Apr 17 17:38:24.251092 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.250981 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerName="util" Apr 17 17:38:24.251092 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.250997 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" 
containerName="extract" Apr 17 17:38:24.251092 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.251007 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerName="extract" Apr 17 17:38:24.251416 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.251112 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a86c6ec0-510e-4b7f-a2bd-088aef1e26a7" containerName="extract" Apr 17 17:38:24.254760 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.254736 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:24.257889 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.257862 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 17 17:38:24.258166 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.258150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 17 17:38:24.259183 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.259166 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-n75d4\"" Apr 17 17:38:24.259291 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.259198 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 17 17:38:24.265822 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.265786 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 17 17:38:24.312195 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.312139 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6r2q\" (UniqueName: \"kubernetes.io/projected/60ff100b-ee64-48f3-b626-5b5ccc60c7af-kube-api-access-s6r2q\") pod \"maas-keycloak-0\" (UID: 
\"60ff100b-ee64-48f3-b626-5b5ccc60c7af\") " pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:24.413165 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.413130 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6r2q\" (UniqueName: \"kubernetes.io/projected/60ff100b-ee64-48f3-b626-5b5ccc60c7af-kube-api-access-s6r2q\") pod \"maas-keycloak-0\" (UID: \"60ff100b-ee64-48f3-b626-5b5ccc60c7af\") " pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:24.423183 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.423157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6r2q\" (UniqueName: \"kubernetes.io/projected/60ff100b-ee64-48f3-b626-5b5ccc60c7af-kube-api-access-s6r2q\") pod \"maas-keycloak-0\" (UID: \"60ff100b-ee64-48f3-b626-5b5ccc60c7af\") " pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:24.564812 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.564772 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:24.700695 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.700663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 17 17:38:24.702099 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:38:24.702015 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60ff100b_ee64_48f3_b626_5b5ccc60c7af.slice/crio-a93eabb047e3e2f122cba6e52c0f7be9af6f71e182d1e71af9c68be23470fed3 WatchSource:0}: Error finding container a93eabb047e3e2f122cba6e52c0f7be9af6f71e182d1e71af9c68be23470fed3: Status 404 returned error can't find the container with id a93eabb047e3e2f122cba6e52c0f7be9af6f71e182d1e71af9c68be23470fed3 Apr 17 17:38:24.744082 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:24.744045 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"60ff100b-ee64-48f3-b626-5b5ccc60c7af","Type":"ContainerStarted","Data":"a93eabb047e3e2f122cba6e52c0f7be9af6f71e182d1e71af9c68be23470fed3"} Apr 17 17:38:30.773229 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:30.773183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"60ff100b-ee64-48f3-b626-5b5ccc60c7af","Type":"ContainerStarted","Data":"41c48ec3e16a1ec81cd8ac7928d9f1e04e9d1a893dc669a90d3d19852ff0644a"} Apr 17 17:38:30.795555 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:30.795482 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.48855751 podStartE2EDuration="6.795459616s" podCreationTimestamp="2026-04-17 17:38:24 +0000 UTC" firstStartedPulling="2026-04-17 17:38:24.703381753 +0000 UTC m=+731.278730018" lastFinishedPulling="2026-04-17 17:38:30.010283858 +0000 UTC m=+736.585632124" observedRunningTime="2026-04-17 17:38:30.791202306 +0000 UTC m=+737.366550622" 
watchObservedRunningTime="2026-04-17 17:38:30.795459616 +0000 UTC m=+737.370807906" Apr 17 17:38:31.565490 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:31.565432 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:31.567245 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:31.567208 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:32.565896 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:32.565826 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:33.565525 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:33.565472 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:34.565527 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:34.565478 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:34.567235 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:34.567200 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" 
Apr 17 17:38:35.565561 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:35.565506 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:36.565617 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:36.565561 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:37.565794 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:37.565749 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:38.565955 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:38.565906 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:39.565531 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:39.565470 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:40.565603 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:40.565547 2576 
prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:41.565366 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:41.565312 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:42.565966 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:42.565918 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.46:9000/health/started\": dial tcp 10.132.0.46:9000: connect: connection refused" Apr 17 17:38:43.679146 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:43.679104 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:43.698587 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:43.698532 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 17:38:53.685284 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:53.685245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 17 17:38:55.687344 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:38:55.687302 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:39:24.333238 ip-10-0-137-109 
kubenswrapper[2576]: I0417 17:39:24.333192 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 17 17:39:24.333641 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:24.333444 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak" containerID="cri-o://41c48ec3e16a1ec81cd8ac7928d9f1e04e9d1a893dc669a90d3d19852ff0644a" gracePeriod=30 Apr 17 17:39:26.016467 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:26.016426 2576 generic.go:358] "Generic (PLEG): container finished" podID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerID="41c48ec3e16a1ec81cd8ac7928d9f1e04e9d1a893dc669a90d3d19852ff0644a" exitCode=143 Apr 17 17:39:26.016836 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:26.016472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"60ff100b-ee64-48f3-b626-5b5ccc60c7af","Type":"ContainerDied","Data":"41c48ec3e16a1ec81cd8ac7928d9f1e04e9d1a893dc669a90d3d19852ff0644a"} Apr 17 17:39:26.378431 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:26.378404 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 17 17:39:26.480761 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:26.480724 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6r2q\" (UniqueName: \"kubernetes.io/projected/60ff100b-ee64-48f3-b626-5b5ccc60c7af-kube-api-access-s6r2q\") pod \"60ff100b-ee64-48f3-b626-5b5ccc60c7af\" (UID: \"60ff100b-ee64-48f3-b626-5b5ccc60c7af\") " Apr 17 17:39:26.482919 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:26.482886 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ff100b-ee64-48f3-b626-5b5ccc60c7af-kube-api-access-s6r2q" (OuterVolumeSpecName: "kube-api-access-s6r2q") pod "60ff100b-ee64-48f3-b626-5b5ccc60c7af" (UID: "60ff100b-ee64-48f3-b626-5b5ccc60c7af"). InnerVolumeSpecName "kube-api-access-s6r2q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:39:26.582205 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:26.582114 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s6r2q\" (UniqueName: \"kubernetes.io/projected/60ff100b-ee64-48f3-b626-5b5ccc60c7af-kube-api-access-s6r2q\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 17:39:27.021464 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.021432 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.021962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.021453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"60ff100b-ee64-48f3-b626-5b5ccc60c7af","Type":"ContainerDied","Data":"a93eabb047e3e2f122cba6e52c0f7be9af6f71e182d1e71af9c68be23470fed3"}
Apr 17 17:39:27.021962 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.021512 2576 scope.go:117] "RemoveContainer" containerID="41c48ec3e16a1ec81cd8ac7928d9f1e04e9d1a893dc669a90d3d19852ff0644a"
Apr 17 17:39:27.045132 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.045097 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 17 17:39:27.054784 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.054751 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 17 17:39:27.078193 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.078156 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 17 17:39:27.078534 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.078520 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak"
Apr 17 17:39:27.078590 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.078537 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak"
Apr 17 17:39:27.078625 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.078604 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" containerName="keycloak"
Apr 17 17:39:27.082896 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.082876 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.085777 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.085751 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-n75d4\""
Apr 17 17:39:27.085903 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.085811 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 17 17:39:27.085903 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.085815 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 17 17:39:27.085903 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.085753 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 17 17:39:27.085903 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.085863 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\""
Apr 17 17:39:27.091434 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.091410 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 17 17:39:27.186828 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.186786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88r8j\" (UniqueName: \"kubernetes.io/projected/491681bd-d9e1-4fec-baac-1cb92b67e3a8-kube-api-access-88r8j\") pod \"maas-keycloak-0\" (UID: \"491681bd-d9e1-4fec-baac-1cb92b67e3a8\") " pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.187000 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.186871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/491681bd-d9e1-4fec-baac-1cb92b67e3a8-test-realms\") pod \"maas-keycloak-0\" (UID: \"491681bd-d9e1-4fec-baac-1cb92b67e3a8\") " pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.287908 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.287822 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88r8j\" (UniqueName: \"kubernetes.io/projected/491681bd-d9e1-4fec-baac-1cb92b67e3a8-kube-api-access-88r8j\") pod \"maas-keycloak-0\" (UID: \"491681bd-d9e1-4fec-baac-1cb92b67e3a8\") " pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.287908 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.287892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/491681bd-d9e1-4fec-baac-1cb92b67e3a8-test-realms\") pod \"maas-keycloak-0\" (UID: \"491681bd-d9e1-4fec-baac-1cb92b67e3a8\") " pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.288618 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.288598 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/491681bd-d9e1-4fec-baac-1cb92b67e3a8-test-realms\") pod \"maas-keycloak-0\" (UID: \"491681bd-d9e1-4fec-baac-1cb92b67e3a8\") " pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.296756 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.296726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88r8j\" (UniqueName: \"kubernetes.io/projected/491681bd-d9e1-4fec-baac-1cb92b67e3a8-kube-api-access-88r8j\") pod \"maas-keycloak-0\" (UID: \"491681bd-d9e1-4fec-baac-1cb92b67e3a8\") " pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.393604 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.393567 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:27.521887 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:27.521861 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 17 17:39:27.523746 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:39:27.523714 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491681bd_d9e1_4fec_baac_1cb92b67e3a8.slice/crio-883356e9786a778002bbcdb1edb6ff7be004b1444fc812c7a5b8a953499b0d7f WatchSource:0}: Error finding container 883356e9786a778002bbcdb1edb6ff7be004b1444fc812c7a5b8a953499b0d7f: Status 404 returned error can't find the container with id 883356e9786a778002bbcdb1edb6ff7be004b1444fc812c7a5b8a953499b0d7f
Apr 17 17:39:28.005675 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:28.005296 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ff100b-ee64-48f3-b626-5b5ccc60c7af" path="/var/lib/kubelet/pods/60ff100b-ee64-48f3-b626-5b5ccc60c7af/volumes"
Apr 17 17:39:28.026864 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:28.026817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"491681bd-d9e1-4fec-baac-1cb92b67e3a8","Type":"ContainerStarted","Data":"053539ddf557494ecb3d21a2800b0fd9c20033a80e6de493dadcf6c4da20599b"}
Apr 17 17:39:28.026864 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:28.026868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"491681bd-d9e1-4fec-baac-1cb92b67e3a8","Type":"ContainerStarted","Data":"883356e9786a778002bbcdb1edb6ff7be004b1444fc812c7a5b8a953499b0d7f"}
Apr 17 17:39:28.046278 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:28.046192 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=0.751088746 podStartE2EDuration="1.046173632s" podCreationTimestamp="2026-04-17 17:39:27 +0000 UTC" firstStartedPulling="2026-04-17 17:39:27.525066014 +0000 UTC m=+794.100414283" lastFinishedPulling="2026-04-17 17:39:27.820150891 +0000 UTC m=+794.395499169" observedRunningTime="2026-04-17 17:39:28.045518605 +0000 UTC m=+794.620866902" watchObservedRunningTime="2026-04-17 17:39:28.046173632 +0000 UTC m=+794.621521923"
Apr 17 17:39:28.394593 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:28.394543 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:28.396485 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:28.396446 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:29.394734 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:29.394682 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:30.395108 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:30.395016 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:31.394669 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:31.394613 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:32.395120 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:32.395070 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:33.394186 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:33.394125 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:34.394510 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:34.394458 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:35.394286 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:35.394230 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:36.394984 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:36.394940 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:37.393849 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:37.393803 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:37.394188 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:37.394147 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:38.394072 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:38.393995 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:39.394533 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:39.394482 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:40.394672 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:40.394625 2576 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.47:9000/health/started\": dial tcp 10.132.0.47:9000: connect: connection refused"
Apr 17 17:39:41.501163 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:41.501119 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0"
Apr 17 17:39:41.516546 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:41.516499 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:39:51.505394 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:39:51.505341 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="491681bd-d9e1-4fec-baac-1cb92b67e3a8" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:40:01.506617 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:40:01.506575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0"
Apr 17 17:40:19.601794 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:40:19.601747 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:41:04.091865 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:41:04.091778 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:41:13.969843 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:41:13.969812 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:41:13.970748 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:41:13.970715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:41:13.978167 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:41:13.978127 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:41:13.978362 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:41:13.978256 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:41:17.686385 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:41:17.686352 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:41:22.783540 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:41:22.783499 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:41:47.101462 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:41:47.101425 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:42:28.885256 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:42:28.885216 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:42:37.584008 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:42:37.583925 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:43:08.906386 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:43:08.906351 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:43:24.294418 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:43:24.294379 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:44:02.785255 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:44:02.785151 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:44:19.881949 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:44:19.881908 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:44:33.794786 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:44:33.794746 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:44:49.783035 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:44:49.782993 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:45:00.149902 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.149868 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607465-5zcnx"]
Apr 17 17:45:00.153398 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.153378 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx"
Apr 17 17:45:00.156588 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.156555 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-gl78k\""
Apr 17 17:45:00.171393 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.171360 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607465-5zcnx"]
Apr 17 17:45:00.292853 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.292810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vchw2\" (UniqueName: \"kubernetes.io/projected/f3023392-15e7-4519-a597-c81c55e0c715-kube-api-access-vchw2\") pod \"maas-api-key-cleanup-29607465-5zcnx\" (UID: \"f3023392-15e7-4519-a597-c81c55e0c715\") " pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx"
Apr 17 17:45:00.394210 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.394167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vchw2\" (UniqueName: \"kubernetes.io/projected/f3023392-15e7-4519-a597-c81c55e0c715-kube-api-access-vchw2\") pod \"maas-api-key-cleanup-29607465-5zcnx\" (UID: \"f3023392-15e7-4519-a597-c81c55e0c715\") " pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx"
Apr 17 17:45:00.403806 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.403738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vchw2\" (UniqueName: \"kubernetes.io/projected/f3023392-15e7-4519-a597-c81c55e0c715-kube-api-access-vchw2\") pod \"maas-api-key-cleanup-29607465-5zcnx\" (UID: \"f3023392-15e7-4519-a597-c81c55e0c715\") " pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx"
Apr 17 17:45:00.464412 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.464363 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx"
Apr 17 17:45:00.798206 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.798167 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607465-5zcnx"]
Apr 17 17:45:00.799422 ip-10-0-137-109 kubenswrapper[2576]: W0417 17:45:00.799396 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3023392_15e7_4519_a597_c81c55e0c715.slice/crio-4662b0a2b77347d37d3e17bc7ccd3439cf6b450c19e3c5745ffac7399532eb3f WatchSource:0}: Error finding container 4662b0a2b77347d37d3e17bc7ccd3439cf6b450c19e3c5745ffac7399532eb3f: Status 404 returned error can't find the container with id 4662b0a2b77347d37d3e17bc7ccd3439cf6b450c19e3c5745ffac7399532eb3f
Apr 17 17:45:00.801110 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:00.801094 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:45:01.327961 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:01.327927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" event={"ID":"f3023392-15e7-4519-a597-c81c55e0c715","Type":"ContainerStarted","Data":"4662b0a2b77347d37d3e17bc7ccd3439cf6b450c19e3c5745ffac7399532eb3f"}
Apr 17 17:45:04.341788 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:04.341749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" event={"ID":"f3023392-15e7-4519-a597-c81c55e0c715","Type":"ContainerStarted","Data":"a9060730bbd00220afc275056870cbfb458f5520a3d92f19efc056a8ed6dfa86"}
Apr 17 17:45:04.358926 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:04.358870 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" podStartSLOduration=1.66157523 podStartE2EDuration="4.358852585s" podCreationTimestamp="2026-04-17 17:45:00 +0000 UTC" firstStartedPulling="2026-04-17 17:45:00.801232436 +0000 UTC m=+1127.376580702" lastFinishedPulling="2026-04-17 17:45:03.49850979 +0000 UTC m=+1130.073858057" observedRunningTime="2026-04-17 17:45:04.358200662 +0000 UTC m=+1130.933548950" watchObservedRunningTime="2026-04-17 17:45:04.358852585 +0000 UTC m=+1130.934200873"
Apr 17 17:45:24.424060 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:24.423951 2576 generic.go:358] "Generic (PLEG): container finished" podID="f3023392-15e7-4519-a597-c81c55e0c715" containerID="a9060730bbd00220afc275056870cbfb458f5520a3d92f19efc056a8ed6dfa86" exitCode=6
Apr 17 17:45:24.424060 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:24.424052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" event={"ID":"f3023392-15e7-4519-a597-c81c55e0c715","Type":"ContainerDied","Data":"a9060730bbd00220afc275056870cbfb458f5520a3d92f19efc056a8ed6dfa86"}
Apr 17 17:45:24.424495 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:24.424404 2576 scope.go:117] "RemoveContainer" containerID="a9060730bbd00220afc275056870cbfb458f5520a3d92f19efc056a8ed6dfa86"
Apr 17 17:45:25.430073 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:25.430001 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" event={"ID":"f3023392-15e7-4519-a597-c81c55e0c715","Type":"ContainerStarted","Data":"634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12"}
Apr 17 17:45:41.992153 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:41.992071 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:45:45.510226 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:45.510186 2576 generic.go:358] "Generic (PLEG): container finished" podID="f3023392-15e7-4519-a597-c81c55e0c715" containerID="634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12" exitCode=6
Apr 17 17:45:45.510660 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:45.510256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" event={"ID":"f3023392-15e7-4519-a597-c81c55e0c715","Type":"ContainerDied","Data":"634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12"}
Apr 17 17:45:45.510660 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:45.510299 2576 scope.go:117] "RemoveContainer" containerID="a9060730bbd00220afc275056870cbfb458f5520a3d92f19efc056a8ed6dfa86"
Apr 17 17:45:45.510660 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:45.510594 2576 scope.go:117] "RemoveContainer" containerID="634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12"
Apr 17 17:45:45.510836 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:45:45.510817 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607465-5zcnx_opendatahub(f3023392-15e7-4519-a597-c81c55e0c715)\"" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" podUID="f3023392-15e7-4519-a597-c81c55e0c715"
Apr 17 17:45:52.126730 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:52.126686 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:45:56.996873 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:56.996834 2576 scope.go:117] "RemoveContainer" containerID="634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12"
Apr 17 17:45:57.566269 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:57.566166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" event={"ID":"f3023392-15e7-4519-a597-c81c55e0c715","Type":"ContainerStarted","Data":"76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43"}
Apr 17 17:45:58.044695 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:58.044661 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607465-5zcnx"]
Apr 17 17:45:58.570425 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:45:58.570359 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" containerID="cri-o://76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43" gracePeriod=30
Apr 17 17:46:08.078365 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:08.078317 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:46:14.004824 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:14.004792 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:46:14.006286 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:14.006263 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:46:14.011404 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:14.011379 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:46:14.012847 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:14.012828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:46:16.786822 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:16.786771 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:46:17.819759 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:17.819735 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx"
Apr 17 17:46:17.863604 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:17.863518 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vchw2\" (UniqueName: \"kubernetes.io/projected/f3023392-15e7-4519-a597-c81c55e0c715-kube-api-access-vchw2\") pod \"f3023392-15e7-4519-a597-c81c55e0c715\" (UID: \"f3023392-15e7-4519-a597-c81c55e0c715\") "
Apr 17 17:46:17.865798 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:17.865755 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3023392-15e7-4519-a597-c81c55e0c715-kube-api-access-vchw2" (OuterVolumeSpecName: "kube-api-access-vchw2") pod "f3023392-15e7-4519-a597-c81c55e0c715" (UID: "f3023392-15e7-4519-a597-c81c55e0c715"). InnerVolumeSpecName "kube-api-access-vchw2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:46:17.965112 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:17.965073 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vchw2\" (UniqueName: \"kubernetes.io/projected/f3023392-15e7-4519-a597-c81c55e0c715-kube-api-access-vchw2\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\""
Apr 17 17:46:18.649579 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.649537 2576 generic.go:358] "Generic (PLEG): container finished" podID="f3023392-15e7-4519-a597-c81c55e0c715" containerID="76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43" exitCode=6
Apr 17 17:46:18.649781 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.649631 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" event={"ID":"f3023392-15e7-4519-a597-c81c55e0c715","Type":"ContainerDied","Data":"76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43"}
Apr 17 17:46:18.649781 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.649685 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx" event={"ID":"f3023392-15e7-4519-a597-c81c55e0c715","Type":"ContainerDied","Data":"4662b0a2b77347d37d3e17bc7ccd3439cf6b450c19e3c5745ffac7399532eb3f"}
Apr 17 17:46:18.649781 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.649706 2576 scope.go:117] "RemoveContainer" containerID="76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43"
Apr 17 17:46:18.649781 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.649646 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607465-5zcnx"
Apr 17 17:46:18.658774 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.658754 2576 scope.go:117] "RemoveContainer" containerID="634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12"
Apr 17 17:46:18.667086 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.667056 2576 scope.go:117] "RemoveContainer" containerID="76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43"
Apr 17 17:46:18.667393 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:46:18.667374 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43\": container with ID starting with 76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43 not found: ID does not exist" containerID="76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43"
Apr 17 17:46:18.667454 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.667412 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43"} err="failed to get container status \"76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43\": rpc error: code = NotFound desc = could not find container \"76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43\": container with ID starting with 76154505d159786bf57ea5dc57c48a7b8900b638a4e7aad81afdc5664581ff43 not found: ID does not exist"
Apr 17 17:46:18.667454 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.667431 2576 scope.go:117] "RemoveContainer" containerID="634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12"
Apr 17 17:46:18.667727 ip-10-0-137-109 kubenswrapper[2576]: E0417 17:46:18.667704 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12\": container with ID starting with 634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12 not found: ID does not exist" containerID="634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12"
Apr 17 17:46:18.667776 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.667734 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12"} err="failed to get container status \"634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12\": rpc error: code = NotFound desc = could not find container \"634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12\": container with ID starting with 634621400f73b437828c9e38ed1302f53234806eab660f0d5bfa5bdc8aba2e12 not found: ID does not exist"
Apr 17 17:46:18.671546 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.671514 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607465-5zcnx"]
Apr 17 17:46:18.676115 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:18.676087 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607465-5zcnx"]
Apr 17 17:46:20.001844 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:20.001807 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3023392-15e7-4519-a597-c81c55e0c715" path="/var/lib/kubelet/pods/f3023392-15e7-4519-a597-c81c55e0c715/volumes"
Apr 17 17:46:34.281234 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:34.281194 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:46:41.180820 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:46:41.180769 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:47:14.682512 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:47:14.682471 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:47:22.787756 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:47:22.787719 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:47:31.184574 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:47:31.184533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:47:39.790465 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:47:39.787279 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:47:48.185422 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:47:48.185383 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:48:05.489504 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:48:05.489466 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:48:15.677227 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:48:15.677185 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:49:03.288202 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:49:03.288116 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:49:11.883751 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:49:11.883709 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:49:21.089772 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:49:21.089729 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:49:28.983777 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:49:28.983736 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:49:38.980684 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:49:38.980642 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:49:47.183722 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:49:47.183681 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:49:55.677539 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:49:55.677501 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:50:00.183613 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:50:00.183573 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:50:03.982499 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:50:03.982460 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:50:13.083557 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:50:13.083514 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:50:22.496486 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:50:22.496436 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:50:30.930670 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:50:30.930629 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:50:39.886791 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:50:39.886752 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:50:48.888275 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:50:48.888236 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:50:57.286900 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:50:57.286865 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:51:06.287660 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:51:06.287625 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:51:14.047134 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:51:14.047102 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:51:14.049548 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:51:14.049513 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log"
Apr 17 17:51:14.054178 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:51:14.054155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:51:14.056707 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:51:14.056687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log"
Apr 17 17:51:14.089414 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:51:14.089376 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:51:23.384398 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:51:23.384359 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"]
Apr 17 17:51:32.001857 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:51:32.001764 2576
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:52:42.183420 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:52:42.183383 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:52:48.882570 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:52:48.882530 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:52:59.084256 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:52:59.084217 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:53:29.480342 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:53:29.480305 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:54:12.089129 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:54:12.089088 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:54:20.583984 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:54:20.583944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:54:29.292673 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:54:29.292637 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:54:37.184938 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:54:37.184854 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:54:46.198650 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:54:46.198613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:54:55.080131 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:54:55.080090 2576 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:55:03.291699 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:55:03.291661 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:55:08.489783 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:55:08.489732 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:55:18.185343 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:55:18.185305 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:55:26.783188 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:55:26.783145 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:55:35.281819 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:55:35.281776 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:55:43.081303 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:55:43.081265 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:56:00.184002 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:00.183964 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:56:08.782521 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:08.782477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:56:14.078281 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:14.078252 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 17:56:14.082798 ip-10-0-137-109 kubenswrapper[2576]: I0417 
17:56:14.082768 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 17:56:14.085333 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:14.085311 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 17:56:14.090199 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:14.090177 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 17:56:17.895359 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:17.895318 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:56:26.085837 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:26.085793 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:56:43.022425 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:43.022384 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:56:51.387865 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:56:51.387827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:57:00.983731 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:57:00.983696 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:57:08.691807 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:57:08.691767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:57:18.089819 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:57:18.089777 2576 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:57:26.390515 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:57:26.390475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:57:34.683456 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:57:34.683372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:57:51.893354 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:57:51.893316 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:58:01.289773 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:58:01.289733 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:58:17.386630 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:58:17.386589 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:58:26.393759 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:58:26.393711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:58:35.283575 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:58:35.283536 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:58:43.088318 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:58:43.088279 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:58:51.085571 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:58:51.085524 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:59:07.307785 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:59:07.307701 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:59:16.255662 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:59:16.255626 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:59:24.437872 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:59:24.437825 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:59:32.999378 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:59:32.999342 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 17:59:56.205048 ip-10-0-137-109 kubenswrapper[2576]: I0417 17:59:56.204991 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 18:00:00.147444 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147406 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607480-t7tft"] Apr 17 18:00:00.147830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147754 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:00:00.147830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147765 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:00:00.147830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147779 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:00:00.147830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147785 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:00:00.147830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147793 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:00:00.147830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147799 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:00:00.148043 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147865 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:00:00.148043 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.147874 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:00:00.150735 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.150717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" Apr 17 18:00:00.153693 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.153671 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-gl78k\"" Apr 17 18:00:00.171417 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.171383 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607480-t7tft"] Apr 17 18:00:00.258620 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.258558 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj77t\" (UniqueName: \"kubernetes.io/projected/a3fbe75c-e853-4ff8-819a-022c2ef450fa-kube-api-access-qj77t\") pod \"maas-api-key-cleanup-29607480-t7tft\" (UID: \"a3fbe75c-e853-4ff8-819a-022c2ef450fa\") " pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" Apr 17 18:00:00.359114 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.359077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-qj77t\" (UniqueName: \"kubernetes.io/projected/a3fbe75c-e853-4ff8-819a-022c2ef450fa-kube-api-access-qj77t\") pod \"maas-api-key-cleanup-29607480-t7tft\" (UID: \"a3fbe75c-e853-4ff8-819a-022c2ef450fa\") " pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" Apr 17 18:00:00.368227 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.368195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj77t\" (UniqueName: \"kubernetes.io/projected/a3fbe75c-e853-4ff8-819a-022c2ef450fa-kube-api-access-qj77t\") pod \"maas-api-key-cleanup-29607480-t7tft\" (UID: \"a3fbe75c-e853-4ff8-819a-022c2ef450fa\") " pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" Apr 17 18:00:00.461191 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.461101 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" Apr 17 18:00:00.593400 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.593373 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607480-t7tft"] Apr 17 18:00:00.595096 ip-10-0-137-109 kubenswrapper[2576]: W0417 18:00:00.595068 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3fbe75c_e853_4ff8_819a_022c2ef450fa.slice/crio-3f2cafdfdc56fd0e58db19d960007650113390a1863cbef1e77a16909e32cfa8 WatchSource:0}: Error finding container 3f2cafdfdc56fd0e58db19d960007650113390a1863cbef1e77a16909e32cfa8: Status 404 returned error can't find the container with id 3f2cafdfdc56fd0e58db19d960007650113390a1863cbef1e77a16909e32cfa8 Apr 17 18:00:00.596724 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.596707 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:00:00.829772 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:00.829738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" event={"ID":"a3fbe75c-e853-4ff8-819a-022c2ef450fa","Type":"ContainerStarted","Data":"3f2cafdfdc56fd0e58db19d960007650113390a1863cbef1e77a16909e32cfa8"} Apr 17 18:00:01.841227 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:01.841186 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" event={"ID":"a3fbe75c-e853-4ff8-819a-022c2ef450fa","Type":"ContainerStarted","Data":"f9da3cb1ddf7489e6c8c9619d76923a533070178ff245e075a35613729f193f7"} Apr 17 18:00:01.860248 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:01.860194 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" podStartSLOduration=1.860178043 podStartE2EDuration="1.860178043s" podCreationTimestamp="2026-04-17 18:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:00:01.857910155 +0000 UTC m=+2028.433258445" watchObservedRunningTime="2026-04-17 18:00:01.860178043 +0000 UTC m=+2028.435526330" Apr 17 18:00:09.300325 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:09.300290 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-82tgf"] Apr 17 18:00:21.922198 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:21.922159 2576 generic.go:358] "Generic (PLEG): container finished" podID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerID="f9da3cb1ddf7489e6c8c9619d76923a533070178ff245e075a35613729f193f7" exitCode=6 Apr 17 18:00:21.922609 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:21.922220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" event={"ID":"a3fbe75c-e853-4ff8-819a-022c2ef450fa","Type":"ContainerDied","Data":"f9da3cb1ddf7489e6c8c9619d76923a533070178ff245e075a35613729f193f7"} Apr 17 18:00:21.922609 ip-10-0-137-109 
kubenswrapper[2576]: I0417 18:00:21.922584 2576 scope.go:117] "RemoveContainer" containerID="f9da3cb1ddf7489e6c8c9619d76923a533070178ff245e075a35613729f193f7" Apr 17 18:00:22.927338 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:22.927301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" event={"ID":"a3fbe75c-e853-4ff8-819a-022c2ef450fa","Type":"ContainerStarted","Data":"cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579"} Apr 17 18:00:43.024112 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:43.024072 2576 generic.go:358] "Generic (PLEG): container finished" podID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerID="cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579" exitCode=6 Apr 17 18:00:43.024589 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:43.024142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" event={"ID":"a3fbe75c-e853-4ff8-819a-022c2ef450fa","Type":"ContainerDied","Data":"cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579"} Apr 17 18:00:43.024589 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:43.024191 2576 scope.go:117] "RemoveContainer" containerID="f9da3cb1ddf7489e6c8c9619d76923a533070178ff245e075a35613729f193f7" Apr 17 18:00:43.024589 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:43.024461 2576 scope.go:117] "RemoveContainer" containerID="cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579" Apr 17 18:00:43.024736 ip-10-0-137-109 kubenswrapper[2576]: E0417 18:00:43.024705 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607480-t7tft_opendatahub(a3fbe75c-e853-4ff8-819a-022c2ef450fa)\"" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" Apr 17 18:00:54.000001 
ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:53.999963 2576 scope.go:117] "RemoveContainer" containerID="cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579" Apr 17 18:00:55.074980 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:55.074937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" event={"ID":"a3fbe75c-e853-4ff8-819a-022c2ef450fa","Type":"ContainerStarted","Data":"cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae"} Apr 17 18:00:56.104166 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:56.104129 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607480-t7tft"] Apr 17 18:00:56.104582 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:00:56.104340 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerName="cleanup" containerID="cri-o://cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae" gracePeriod=30 Apr 17 18:01:14.116899 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:14.116865 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 18:01:14.124123 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:14.124096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 18:01:14.124576 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:14.124553 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 18:01:14.130751 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:14.130726 2576 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 18:01:14.641749 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:14.641723 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" Apr 17 18:01:14.804541 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:14.804493 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj77t\" (UniqueName: \"kubernetes.io/projected/a3fbe75c-e853-4ff8-819a-022c2ef450fa-kube-api-access-qj77t\") pod \"a3fbe75c-e853-4ff8-819a-022c2ef450fa\" (UID: \"a3fbe75c-e853-4ff8-819a-022c2ef450fa\") " Apr 17 18:01:14.806688 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:14.806646 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3fbe75c-e853-4ff8-819a-022c2ef450fa-kube-api-access-qj77t" (OuterVolumeSpecName: "kube-api-access-qj77t") pod "a3fbe75c-e853-4ff8-819a-022c2ef450fa" (UID: "a3fbe75c-e853-4ff8-819a-022c2ef450fa"). InnerVolumeSpecName "kube-api-access-qj77t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:01:14.905759 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:14.905721 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qj77t\" (UniqueName: \"kubernetes.io/projected/a3fbe75c-e853-4ff8-819a-022c2ef450fa-kube-api-access-qj77t\") on node \"ip-10-0-137-109.ec2.internal\" DevicePath \"\"" Apr 17 18:01:15.154395 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.154304 2576 generic.go:358] "Generic (PLEG): container finished" podID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerID="cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae" exitCode=6 Apr 17 18:01:15.154395 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.154380 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" Apr 17 18:01:15.154395 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.154386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" event={"ID":"a3fbe75c-e853-4ff8-819a-022c2ef450fa","Type":"ContainerDied","Data":"cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae"} Apr 17 18:01:15.154910 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.154428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607480-t7tft" event={"ID":"a3fbe75c-e853-4ff8-819a-022c2ef450fa","Type":"ContainerDied","Data":"3f2cafdfdc56fd0e58db19d960007650113390a1863cbef1e77a16909e32cfa8"} Apr 17 18:01:15.154910 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.154449 2576 scope.go:117] "RemoveContainer" containerID="cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae" Apr 17 18:01:15.164236 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.164218 2576 scope.go:117] "RemoveContainer" containerID="cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579" Apr 17 18:01:15.172572 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.172549 2576 scope.go:117] "RemoveContainer" containerID="cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae" Apr 17 18:01:15.172852 ip-10-0-137-109 kubenswrapper[2576]: E0417 18:01:15.172833 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae\": container with ID starting with cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae not found: ID does not exist" containerID="cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae" Apr 17 18:01:15.172933 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.172860 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae"} err="failed to get container status \"cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae\": rpc error: code = NotFound desc = could not find container \"cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae\": container with ID starting with cbb59879c7c345e46f5b2c0f32e6f25c704f4e01872675a8587ee3bf38933dae not found: ID does not exist" Apr 17 18:01:15.172933 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.172880 2576 scope.go:117] "RemoveContainer" containerID="cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579" Apr 17 18:01:15.173170 ip-10-0-137-109 kubenswrapper[2576]: E0417 18:01:15.173149 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579\": container with ID starting with cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579 not found: ID does not exist" containerID="cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579" Apr 17 18:01:15.173223 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.173179 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579"} err="failed to get container status \"cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579\": rpc error: code = NotFound desc = could not find container \"cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579\": container with ID starting with cf3d5d14d3adfb4660eb5f83aa3ad2ac787660ed0e0b1dfadf669d715ae2f579 not found: ID does not exist" Apr 17 18:01:15.177491 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:15.177462 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607480-t7tft"] Apr 17 18:01:15.181544 ip-10-0-137-109 
kubenswrapper[2576]: I0417 18:01:15.181515 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607480-t7tft"] Apr 17 18:01:16.001845 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:01:16.001810 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" path="/var/lib/kubelet/pods/a3fbe75c-e853-4ff8-819a-022c2ef450fa/volumes" Apr 17 18:02:21.339296 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:21.339264 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-77fb85d776-4gb5w_db0b2ad3-0434-48fc-b7d7-385e8237ba41/manager/0.log" Apr 17 18:02:22.462568 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.462541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr_1b7832f4-4a06-4efa-8bdb-8a8986f6002d/util/0.log" Apr 17 18:02:22.477473 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.477445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr_1b7832f4-4a06-4efa-8bdb-8a8986f6002d/pull/0.log" Apr 17 18:02:22.491586 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.491547 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr_1b7832f4-4a06-4efa-8bdb-8a8986f6002d/extract/0.log" Apr 17 18:02:22.610673 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.610641 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x_b98faa71-de81-49c6-a9e7-add858e6a506/extract/0.log" Apr 17 18:02:22.622348 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.622320 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x_b98faa71-de81-49c6-a9e7-add858e6a506/util/0.log" Apr 17 18:02:22.635658 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.635632 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x_b98faa71-de81-49c6-a9e7-add858e6a506/pull/0.log" Apr 17 18:02:22.751998 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.751925 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj_febceaca-a96e-48ab-89af-43744615033b/util/0.log" Apr 17 18:02:22.759334 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.759309 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj_febceaca-a96e-48ab-89af-43744615033b/pull/0.log" Apr 17 18:02:22.766319 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.766303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj_febceaca-a96e-48ab-89af-43744615033b/extract/0.log" Apr 17 18:02:22.876525 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.876492 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g_7be7913a-73b8-4554-a4a7-945a2dab520d/util/0.log" Apr 17 18:02:22.883640 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.883613 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g_7be7913a-73b8-4554-a4a7-945a2dab520d/pull/0.log" Apr 17 18:02:22.890810 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:22.890787 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g_7be7913a-73b8-4554-a4a7-945a2dab520d/extract/0.log" Apr 17 18:02:23.371125 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:23.371094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-gq5h2_30559ecc-aaea-40bd-a0ec-1b5a8131ed89/kuadrant-console-plugin/0.log" Apr 17 18:02:23.745376 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:23.745294 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-82tgf_df99ae14-b508-4fd2-a73b-404f8ddf12c5/limitador/0.log" Apr 17 18:02:24.459908 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:24.459878 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-598f578945-4x886_341f200d-f57e-4466-b96c-0b3d9a8d03ff/kube-auth-proxy/0.log" Apr 17 18:02:32.531037 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:32.530989 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dhvm5_e8cb7f89-2b7c-45fe-9d10-6a4afcec6700/global-pull-secret-syncer/0.log" Apr 17 18:02:32.658784 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:32.658748 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xn5kt_d1855494-351c-453b-b8fb-c603cdf8d73a/konnectivity-agent/0.log" Apr 17 18:02:32.682390 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:32.682359 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-109.ec2.internal_2a191cd1b01780dbedd511a0221fb1c6/haproxy/0.log" Apr 17 18:02:35.951982 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:35.951953 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr_1b7832f4-4a06-4efa-8bdb-8a8986f6002d/extract/0.log" Apr 17 18:02:35.986638 
ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:35.986609 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr_1b7832f4-4a06-4efa-8bdb-8a8986f6002d/util/0.log" Apr 17 18:02:36.025973 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.025946 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kqsnr_1b7832f4-4a06-4efa-8bdb-8a8986f6002d/pull/0.log" Apr 17 18:02:36.055588 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.055557 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x_b98faa71-de81-49c6-a9e7-add858e6a506/extract/0.log" Apr 17 18:02:36.092509 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.092472 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x_b98faa71-de81-49c6-a9e7-add858e6a506/util/0.log" Apr 17 18:02:36.125910 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.125883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0vd62x_b98faa71-de81-49c6-a9e7-add858e6a506/pull/0.log" Apr 17 18:02:36.161300 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.161274 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj_febceaca-a96e-48ab-89af-43744615033b/extract/0.log" Apr 17 18:02:36.193618 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.193580 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj_febceaca-a96e-48ab-89af-43744615033b/util/0.log" Apr 17 18:02:36.231052 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.230950 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed734l8cj_febceaca-a96e-48ab-89af-43744615033b/pull/0.log" Apr 17 18:02:36.285164 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.285138 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g_7be7913a-73b8-4554-a4a7-945a2dab520d/extract/0.log" Apr 17 18:02:36.322573 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.322541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g_7be7913a-73b8-4554-a4a7-945a2dab520d/util/0.log" Apr 17 18:02:36.354265 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.354233 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef19s54g_7be7913a-73b8-4554-a4a7-945a2dab520d/pull/0.log" Apr 17 18:02:36.741648 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.741620 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-gq5h2_30559ecc-aaea-40bd-a0ec-1b5a8131ed89/kuadrant-console-plugin/0.log" Apr 17 18:02:36.930583 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:36.930554 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-82tgf_df99ae14-b508-4fd2-a73b-404f8ddf12c5/limitador/0.log" Apr 17 18:02:38.556424 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:38.556397 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tf5x4_93f4c3f7-38c9-4335-a216-1b515e0a943a/kube-state-metrics/0.log" Apr 17 18:02:38.578303 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:38.578276 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tf5x4_93f4c3f7-38c9-4335-a216-1b515e0a943a/kube-rbac-proxy-main/0.log" Apr 17 18:02:38.602400 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:38.602370 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tf5x4_93f4c3f7-38c9-4335-a216-1b515e0a943a/kube-rbac-proxy-self/0.log" Apr 17 18:02:38.637330 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:38.637299 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-68cbf949c5-kvvxc_9c9abd61-e722-4556-8959-44e58ae5fa17/metrics-server/0.log" Apr 17 18:02:38.972846 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:38.972815 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zwbtc_a3865d5e-8fda-423f-8730-6bbaa0c85e25/node-exporter/0.log" Apr 17 18:02:39.013128 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:39.013097 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zwbtc_a3865d5e-8fda-423f-8730-6bbaa0c85e25/kube-rbac-proxy/0.log" Apr 17 18:02:39.060349 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:39.060321 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zwbtc_a3865d5e-8fda-423f-8730-6bbaa0c85e25/init-textfile/0.log" Apr 17 18:02:41.255869 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.255836 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl"] Apr 17 18:02:41.256359 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.256212 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerName="cleanup" Apr 17 18:02:41.256359 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.256226 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerName="cleanup" Apr 17 18:02:41.256359 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.256234 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerName="cleanup" Apr 17 18:02:41.256359 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.256240 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerName="cleanup" Apr 17 18:02:41.256359 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.256308 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3023392-15e7-4519-a597-c81c55e0c715" containerName="cleanup" Apr 17 18:02:41.256359 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.256318 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerName="cleanup" Apr 17 18:02:41.256359 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.256326 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3fbe75c-e853-4ff8-819a-022c2ef450fa" containerName="cleanup" Apr 17 18:02:41.260566 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.260547 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.263431 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.263407 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8rk54\"/\"default-dockercfg-tkwzj\"" Apr 17 18:02:41.263576 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.263468 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8rk54\"/\"openshift-service-ca.crt\"" Apr 17 18:02:41.264694 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.264672 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8rk54\"/\"kube-root-ca.crt\"" Apr 17 18:02:41.269591 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.269560 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl"] Apr 17 18:02:41.327598 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.327545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-sys\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.327598 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.327589 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-proc\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.327830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.327625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8b2vt\" (UniqueName: \"kubernetes.io/projected/698360dd-7a63-45bf-81ca-2f83356de9a7-kube-api-access-8b2vt\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.327830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.327655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-lib-modules\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.327830 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.327689 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-podres\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.428794 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.428755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-sys\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.428794 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.428792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-proc\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " 
pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.429098 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.428824 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2vt\" (UniqueName: \"kubernetes.io/projected/698360dd-7a63-45bf-81ca-2f83356de9a7-kube-api-access-8b2vt\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.429098 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.428853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-lib-modules\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.429098 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.428882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-sys\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.429098 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.428889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-proc\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.429098 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.428891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-podres\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.429098 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.428996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-lib-modules\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.429098 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.429008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/698360dd-7a63-45bf-81ca-2f83356de9a7-podres\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.438554 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.438528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2vt\" (UniqueName: \"kubernetes.io/projected/698360dd-7a63-45bf-81ca-2f83356de9a7-kube-api-access-8b2vt\") pod \"perf-node-gather-daemonset-bsmjl\" (UID: \"698360dd-7a63-45bf-81ca-2f83356de9a7\") " pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.571955 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.571866 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:41.714411 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.714381 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl"] Apr 17 18:02:41.716843 ip-10-0-137-109 kubenswrapper[2576]: W0417 18:02:41.716806 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod698360dd_7a63_45bf_81ca_2f83356de9a7.slice/crio-9471d14f6af3fa3b93a53e40ca842e676275dea053ec16ac1b1b6cc1241ff620 WatchSource:0}: Error finding container 9471d14f6af3fa3b93a53e40ca842e676275dea053ec16ac1b1b6cc1241ff620: Status 404 returned error can't find the container with id 9471d14f6af3fa3b93a53e40ca842e676275dea053ec16ac1b1b6cc1241ff620 Apr 17 18:02:41.742179 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.742153 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/1.log" Apr 17 18:02:41.746526 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:41.746501 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-722rn_563dc28b-3cac-43af-adc2-31d0202d2905/console-operator/2.log" Apr 17 18:02:42.250209 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:42.250175 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5554d84dd4-42zlp_65400f04-d335-4c62-be8b-b4561c51315e/console/0.log" Apr 17 18:02:42.286561 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:42.286526 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-7dsfs_57a4d511-24de-4e96-ab0c-2099602b447f/download-server/0.log" Apr 17 18:02:42.494820 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:42.494787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" event={"ID":"698360dd-7a63-45bf-81ca-2f83356de9a7","Type":"ContainerStarted","Data":"7ebd608b282625d2bf7850a8619640b96d50c8277bb9be0d0f35615ea775432f"} Apr 17 18:02:42.495000 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:42.494825 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" event={"ID":"698360dd-7a63-45bf-81ca-2f83356de9a7","Type":"ContainerStarted","Data":"9471d14f6af3fa3b93a53e40ca842e676275dea053ec16ac1b1b6cc1241ff620"} Apr 17 18:02:42.495000 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:42.494981 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:42.521184 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:42.521133 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" podStartSLOduration=1.521114774 podStartE2EDuration="1.521114774s" podCreationTimestamp="2026-04-17 18:02:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:02:42.519444652 +0000 UTC m=+2189.094792940" watchObservedRunningTime="2026-04-17 18:02:42.521114774 +0000 UTC m=+2189.096463062" Apr 17 18:02:42.857357 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:42.857278 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-k7wps_59eaa364-77a3-4e8d-b694-75e3c4a185b8/volume-data-source-validator/0.log" Apr 17 18:02:43.809391 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:43.809362 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-smpcv_e0d9b600-a7c0-458a-b964-3feb2fe753c5/dns/0.log" Apr 17 18:02:43.839807 ip-10-0-137-109 kubenswrapper[2576]: 
I0417 18:02:43.839777 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-smpcv_e0d9b600-a7c0-458a-b964-3feb2fe753c5/kube-rbac-proxy/0.log" Apr 17 18:02:43.896400 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:43.896368 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lrfkm_c578d9f0-5deb-4aba-b916-5f7c6bec807d/dns-node-resolver/0.log" Apr 17 18:02:44.528058 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:44.528007 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wxqd8_141084a6-6657-4416-a7f1-21339dfd8b0a/node-ca/0.log" Apr 17 18:02:45.605494 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:45.605465 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-598f578945-4x886_341f200d-f57e-4466-b96c-0b3d9a8d03ff/kube-auth-proxy/0.log" Apr 17 18:02:46.356134 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:46.356103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cgfbj_9e4659f9-8397-426b-a4db-39edf813f27a/serve-healthcheck-canary/0.log" Apr 17 18:02:46.894358 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:46.894320 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-vz885_32d852f3-1d88-49d6-93e3-d36b1a499102/insights-operator/0.log" Apr 17 18:02:46.894744 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:46.894654 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-vz885_32d852f3-1d88-49d6-93e3-d36b1a499102/insights-operator/1.log" Apr 17 18:02:46.999586 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:46.999556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bqbpv_e7ce222f-b049-4136-aa1a-e11fe5ae538b/kube-rbac-proxy/0.log" Apr 17 18:02:47.030650 ip-10-0-137-109 
kubenswrapper[2576]: I0417 18:02:47.030622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bqbpv_e7ce222f-b049-4136-aa1a-e11fe5ae538b/exporter/0.log" Apr 17 18:02:47.064119 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:47.064094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-bqbpv_e7ce222f-b049-4136-aa1a-e11fe5ae538b/extractor/0.log" Apr 17 18:02:48.508784 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:48.508758 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8rk54/perf-node-gather-daemonset-bsmjl" Apr 17 18:02:49.330617 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:49.330585 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-77fb85d776-4gb5w_db0b2ad3-0434-48fc-b7d7-385e8237ba41/manager/0.log" Apr 17 18:02:50.762604 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:50.762568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6f45766749-7dhmf_8b68fd41-28f3-4939-80d5-5fc14f95771b/manager/0.log" Apr 17 18:02:50.820843 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:50.820811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-g7jcb_1eb573e0-c340-400e-a9ee-c6bc80eddb58/openshift-lws-operator/0.log" Apr 17 18:02:55.681215 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:55.681188 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-hkj2b_32e7e58d-4b83-4c34-9bc6-023ef74bde5d/migrator/0.log" Apr 17 18:02:55.708063 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:55.708009 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-hkj2b_32e7e58d-4b83-4c34-9bc6-023ef74bde5d/graceful-termination/0.log" Apr 17 18:02:56.118860 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:56.118825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-hp4p9_4ef3c872-c400-4d98-9028-72e95653a455/kube-storage-version-migrator-operator/1.log" Apr 17 18:02:56.119819 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:56.119800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-hp4p9_4ef3c872-c400-4d98-9028-72e95653a455/kube-storage-version-migrator-operator/0.log" Apr 17 18:02:57.672686 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:57.672650 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tpw89_3d6308be-3783-4dbc-bf78-b2d0765d73c8/kube-multus-additional-cni-plugins/0.log" Apr 17 18:02:57.700844 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:57.700813 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tpw89_3d6308be-3783-4dbc-bf78-b2d0765d73c8/egress-router-binary-copy/0.log" Apr 17 18:02:57.727392 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:57.727364 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tpw89_3d6308be-3783-4dbc-bf78-b2d0765d73c8/cni-plugins/0.log" Apr 17 18:02:57.768445 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:57.768405 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tpw89_3d6308be-3783-4dbc-bf78-b2d0765d73c8/bond-cni-plugin/0.log" Apr 17 18:02:57.791038 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:57.790997 2576 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tpw89_3d6308be-3783-4dbc-bf78-b2d0765d73c8/routeoverride-cni/0.log" Apr 17 18:02:57.821603 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:57.821578 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tpw89_3d6308be-3783-4dbc-bf78-b2d0765d73c8/whereabouts-cni-bincopy/0.log" Apr 17 18:02:57.844825 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:57.844802 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tpw89_3d6308be-3783-4dbc-bf78-b2d0765d73c8/whereabouts-cni/0.log" Apr 17 18:02:57.881401 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:57.881365 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f76vg_5825282c-f2bb-4812-ae37-269c52c423f7/kube-multus/0.log" Apr 17 18:02:58.011865 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:58.011832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l28wh_c56ede72-4e1e-4a75-9ebe-eabfdfcd2065/network-metrics-daemon/0.log" Apr 17 18:02:58.032789 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:58.032757 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-l28wh_c56ede72-4e1e-4a75-9ebe-eabfdfcd2065/kube-rbac-proxy/0.log" Apr 17 18:02:59.257760 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:59.257731 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-controller/0.log" Apr 17 18:02:59.278715 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:59.278685 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/0.log" Apr 17 18:02:59.288579 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:59.288545 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/ovn-acl-logging/1.log" Apr 17 18:02:59.312703 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:59.312674 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/kube-rbac-proxy-node/0.log" Apr 17 18:02:59.338863 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:59.338832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 18:02:59.361433 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:59.361405 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/northd/0.log" Apr 17 18:02:59.387089 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:59.387065 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/nbdb/0.log" Apr 17 18:02:59.412908 ip-10-0-137-109 kubenswrapper[2576]: I0417 18:02:59.412879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5gqt_47b24c49-3baa-47b7-9b2e-e0fd6d27367d/sbdb/0.log"