Apr 22 18:42:16.591794 ip-10-0-137-223 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:42:17.055886 ip-10-0-137-223 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:42:17.055886 ip-10-0-137-223 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:42:17.055886 ip-10-0-137-223 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:42:17.055886 ip-10-0-137-223 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:42:17.055886 ip-10-0-137-223 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:42:17.057605 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.057517 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:42:17.059919 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059902 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:42:17.059919 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059918 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059922 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059927 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059930 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059933 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059936 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059938 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059941 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059943 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059946 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059956 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059959 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059962 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059964 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059967 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059969 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059972 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059975 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059977 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:42:17.059981 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059980 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059983 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059986 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059990 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059993 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059995 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.059998 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060000 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060003 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060007 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
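Note on the five deprecation warnings above: the remedy the kubelet suggests for each is moving the value into the KubeletConfiguration file named by --config (on this node, /etc/kubernetes/kubelet.conf per the FLAG dump below). A minimal sketch of the equivalent stanza, assuming the kubelet.config.k8s.io/v1beta1 schema and reusing values that appear later in this log; on an OpenShift node this file is rendered by the machine-config operator, so treat it as illustrative rather than something to hand-edit:

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock      # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:                                               # replaces --system-reserved
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
evictionHard:                 # --minimum-container-ttl-duration has no field; eviction thresholds replace it
  memory.available: "100Mi"   # illustrative threshold, not taken from this log
# --pod-infra-container-image has no KubeletConfiguration field; per the server.go:212
# message above, the sandbox (pause) image should also be set in the remote runtime
# (for CRI-O, the pause_image option in crio.conf).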
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060011 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060014 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060016 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060019 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060021 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060024 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060032 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060037 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060040 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:42:17.060444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060044 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060047 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060050 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060052 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060056 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060058 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060060 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060063 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060065 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060068 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060071 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060073 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060076 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060078 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060081 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060085 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060087 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060090 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060093 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060096 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:42:17.060927 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060099 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060101 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060104 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060106 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060109 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060111 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060114 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060116 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060120 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060122 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060130 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060133 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060135 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060138 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060140 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060142 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060146 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060148 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060151 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060153 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:42:17.061449 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060156 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060158 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060161 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060163 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060165 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060168 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060170 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060631 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060638 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060641 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060644 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060646 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060649 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060652 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060654 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060657 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060659 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060662 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060664 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060667 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:42:17.061955 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060670 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060672 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060675 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060678 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060681 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060683 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060686 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060689 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060691 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060694 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060696 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060699 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060701 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060704 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060706 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060709 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060711 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060713 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060717 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060720 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:42:17.062469 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060722 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060725 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060727 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060730 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060733 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060736 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060738 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060741 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060744 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060747 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060749 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060752 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060754 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060757 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060759 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060770 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060773 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060776 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060779 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060781 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:42:17.062994 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060784 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060786 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060791 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060795 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060798 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060801 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060804 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060806 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060809 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060812 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060814 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060817 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060819 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060822 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060825 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060827 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060830 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060832 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060835 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060837 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:42:17.063516 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060840 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060842 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060845 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060847 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060854 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060858 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060862 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060864 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060868 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060871 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060874 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060876 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.060880 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061609 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061618 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061625 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061629 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061634 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061638 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061643 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:42:17.064009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061647 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061651 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061654 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061658 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061661 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061664 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061667 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061670 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061673 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061677 2568 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061679 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061682 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061688 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061691 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061694 2568 flags.go:64] FLAG: --config-dir=""
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061697 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061701 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061705 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061709 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061711 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061715 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061719 2568 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061722 2568 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061725 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061731 2568 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:42:17.064486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061734 2568 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061739 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061742 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061745 2568 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061748 2568 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061752 2568 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061755 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061760 2568 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061763 2568 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061766 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061769 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061772 2568 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061776 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061779 2568 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061782 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061786 2568 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061789 2568 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061792 2568 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061795 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061798 2568 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061801 2568 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061804 2568 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061806 2568 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061811 2568 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061814 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:42:17.065116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061817 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061820 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061823 2568 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061826 2568 flags.go:64] FLAG: --help="false"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061830 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-137-223.ec2.internal"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061833 2568 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061838 2568 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061840 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061844 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061848 2568 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061851 2568 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061854 2568 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061857 2568 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061860 2568 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061863 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061867 2568 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061869 2568 flags.go:64] FLAG: --kube-reserved=""
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061873 2568 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061876 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061879 2568 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061882 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061885 2568 flags.go:64] FLAG: --lock-file=""
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061888 2568 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061891 2568 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 18:42:17.065736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061894 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061900 2568 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061903 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061906 2568 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061909 2568 flags.go:64] FLAG: --logging-format="text"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061912 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061915 2568 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061918 2568 flags.go:64] FLAG: --manifest-url=""
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061921 2568 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061925 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061928 2568 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061932 2568 flags.go:64] FLAG: --max-pods="110"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061935 2568 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061938 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061943 2568 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061946 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061949 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061952 2568 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061956 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061964 2568 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061967 2568 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061971 2568 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061974 2568 flags.go:64] FLAG: --pod-cidr=""
Apr 22 18:42:17.066316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061977 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061983 2568 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061986 2568 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061989 2568 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061992 2568 flags.go:64] FLAG: --port="10250"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061995 2568 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.061998 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ae91f503534f06a5"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062001 2568 flags.go:64] FLAG: --qos-reserved=""
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062004 2568 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062007 2568 flags.go:64] FLAG: --register-node="true"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062010 2568 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062013 2568 flags.go:64] FLAG: --register-with-taints=""
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062017 2568 flags.go:64] FLAG: --registry-burst="10"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062020 2568 flags.go:64] FLAG: --registry-qps="5"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062023 2568 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062026 2568 flags.go:64] FLAG: --reserved-memory=""
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062029 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062032 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062036 2568 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062039 2568 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062042 2568 flags.go:64] FLAG: --runonce="false"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062045 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062048 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062051 2568 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062056 2568 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062059 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 18:42:17.066978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062062 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062065 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062068 2568 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062072 2568 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062075 2568 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062078 2568 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062081 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062084 2568 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062087 2568 flags.go:64] FLAG: --system-cgroups=""
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062090 2568 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062095 2568 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062099 2568 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062102 2568 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062106 2568 flags.go:64] FLAG: --tls-min-version=""
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062109 2568 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062112 2568 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062115 2568 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062118 2568 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062121 2568 flags.go:64] FLAG: --v="2"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062130 2568 flags.go:64] FLAG: --version="false"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062134 2568 flags.go:64] FLAG: --vmodule=""
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062139 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.062142 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063148 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:42:17.067608 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063156 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063159 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063162 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063166 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063169 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063172 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063183 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063186 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063189 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063192 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063194 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063197 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063200 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063202 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063205 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063207 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063210 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063212 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063215 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063217 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:42:17.068186 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063221 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063224 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063226 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063229 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063231 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063234 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063237 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063239 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063241 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063244 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063246 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063249 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063252 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063255 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063257 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063260 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063262 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063265 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063268 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063276 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:42:17.068747 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063279 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063281 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063284 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063287 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063289 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063293 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063296 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063299 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063301 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063304 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063306 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063309 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063311 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063314 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063317 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063319 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063322 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063325 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063327 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063330 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:42:17.069241 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063332 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063335 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063337 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063340 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063346 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063349 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063352 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063356 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063358 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063361 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063364 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063367 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063375 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063378 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063382 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063384 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063387 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063389 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063392 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:42:17.069835 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063394 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:42:17.070606 ip-10-0-137-223 kubenswrapper[2568]: W0422
18:42:17.063397 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:17.070606 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063399 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:17.070606 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063402 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:17.070606 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063404 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:17.070606 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.063407 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:17.070606 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.064198 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:42:17.072524 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.072486 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:42:17.072524 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.072524 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072576 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072582 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072586 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072589 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072592 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072595 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072597 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072600 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072603 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072606 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072609 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072611 2568 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072614 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072616 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072620 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072623 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072626 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072628 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:17.072641 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072631 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072633 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072636 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072639 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072641 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072644 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072647 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072650 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072653 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072656 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072658 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072661 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072664 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072666 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072669 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072673 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
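
The flood of "unrecognized feature gate" warnings above is the kubelet being handed what appears to be OpenShift's cluster-wide feature set; most of those names (GatewayAPI, ManagedBootImages, InsightsConfig, and so on) are implemented by cluster operators rather than by the kubelet, which simply logs each unknown name and continues. A minimal sketch of the underlying mechanism with k8s.io/component-base/featuregate, the library behind the feature_gate.go lines here; the gate names below are invented for illustration, and note that upstream SetFromMap returns an error with this exact wording where the wrapper in this log only warns:

    package main

    import (
        "fmt"

        "k8s.io/component-base/featuregate"
    )

    func main() {
        fg := featuregate.NewFeatureGate()
        // Gates this (hypothetical) binary was compiled with.
        if err := fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
            "ImageVolume": {Default: true, PreRelease: featuregate.Beta},
        }); err != nil {
            panic(err)
        }
        // A gate the binary does not register: upstream rejects it outright.
        if err := fg.SetFromMap(map[string]bool{"SomeOperatorOnlyGate": true}); err != nil {
            fmt.Println(err) // unrecognized feature gate: SomeOperatorOnlyGate
        }
        fmt.Println("ImageVolume:", fg.Enabled("ImageVolume"))
    }
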
Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072678 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072681 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072684 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072686 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:17.073108 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072689 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072692 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072695 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072697 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072700 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072702 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072705 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072708 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072710 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072713 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072715 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072718 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072720 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072723 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072727 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072730 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072733 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072735 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 
18:42:17.072738 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:17.073613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072740 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072743 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072746 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072749 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072751 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072754 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072757 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072760 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072762 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072765 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072767 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072770 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072772 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072775 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072778 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072780 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072783 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072785 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072788 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072797 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:17.074079 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072800 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072802 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 
18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072805 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072809 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072813 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072816 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072818 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072822 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072824 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.072830 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.072998 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073004 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073007 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073010 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073013 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:42:17.074613 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073016 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073020 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073022 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073025 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073027 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073030 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073032 2568 
feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073035 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073038 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073040 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073043 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073046 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073048 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073051 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073053 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073056 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073059 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073063 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073065 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073068 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:42:17.074991 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073070 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073073 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073076 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073078 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073081 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073084 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073087 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073090 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073093 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure 
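
Mixed into the same flood are two warnings of a different kind: KMSv1 is a recognized but deprecated gate (feature_gate.go:349), and ServiceAccountTokenNodeBinding is recognized but already GA (feature_gate.go:351); both assignments take effect and only draw a removal notice. A hedged sketch of that distinction, with the stages inferred from this log rather than from any authoritative gate list:

    package main

    import "k8s.io/component-base/featuregate"

    func main() {
        fg := featuregate.NewFeatureGate()
        _ = fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
            // Stages inferred from the warnings above; assumptions, not fact.
            "KMSv1":                          {Default: false, PreRelease: featuregate.Deprecated},
            "ServiceAccountTokenNodeBinding": {Default: true, PreRelease: featuregate.GA},
        })
        // Both succeed; component-base merely logs the "Setting deprecated
        // feature gate ..." / "Setting GA feature gate ..." warnings.
        _ = fg.Set("KMSv1=true,ServiceAccountTokenNodeBinding=true")
    }
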
Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073095 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073098 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073100 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073103 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073105 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073108 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073111 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073114 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073117 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073120 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073122 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:42:17.075477 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073125 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073128 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073130 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073133 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073135 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073138 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073140 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073143 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073145 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073148 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073152 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: 
W0422 18:42:17.073154 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073157 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073160 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073162 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073165 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073168 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073170 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073173 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073175 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:42:17.075977 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073178 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073180 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073183 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073185 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073188 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073191 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073193 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073196 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073198 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073201 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073204 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073206 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073210 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
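
Each pass over the configured gates closes with an I-level summary of the resolved values; the identical "feature gates: {map[...]}" line appears three times in this startup (18:42:17.064198, 18:42:17.072830, and 18:42:17.073240 just below), consistent with the same gate set being re-applied once per configuration pass. Only gates this kubelet binary actually registers make it into that map. A small illustrative sketch of how such resolved values come about, with names and defaults taken from the summary line and stages assumed:

    package main

    import (
        "fmt"

        "k8s.io/component-base/featuregate"
    )

    func main() {
        fg := featuregate.NewFeatureGate()
        _ = fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
            "NodeSwap":    {Default: false, PreRelease: featuregate.Beta},
            "ImageVolume": {Default: true, PreRelease: featuregate.Beta},
        })
        // Explicit settings override defaults; everything else resolves to
        // its default, which is what the summary map records.
        _ = fg.Set("NodeSwap=false")
        fmt.Println("NodeSwap:", fg.Enabled("NodeSwap"))       // false
        fmt.Println("ImageVolume:", fg.Enabled("ImageVolume")) // true
    }
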
Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073214 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073218 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073221 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073224 2568 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073227 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073230 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:42:17.076470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073233 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:42:17.076933 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:17.073236 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:42:17.076933 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.073240 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:42:17.076933 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.074008 2568 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:42:17.077909 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.077894 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:42:17.078939 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.078928 2568 server.go:1019] "Starting client certificate rotation" Apr 22 18:42:17.079039 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.079021 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:42:17.079069 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.079061 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:42:17.108342 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.108318 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:42:17.110867 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.110846 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:42:17.124277 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.124251 2568 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:42:17.132728 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.132698 2568 log.go:25] "Validated CRI v1 image API" Apr 22 18:42:17.134071 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.134042 2568 
server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:42:17.136878 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.136853 2568 fs.go:135] Filesystem UUIDs: map[34b0bf8d-5f8d-4a02-b2b4-9f2dccf97f7b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9783a17d-8b1f-4b7a-ad09-f0d3d8176105:/dev/nvme0n1p4] Apr 22 18:42:17.136950 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.136875 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:42:17.138019 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.138001 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:42:17.142210 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.142096 2568 manager.go:217] Machine: {Timestamp:2026-04-22 18:42:17.140819551 +0000 UTC m=+0.428255744 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3137948 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2cfbb7ff3ff0eaf3b079d282eddbbb SystemUUID:ec2cfbb7-ff3f-f0ea-f3b0-79d282eddbbb BootID:e42110f4-6f50-4196-b9de-d831fc47b3c1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:21:f4:4a:c2:fd Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:21:f4:4a:c2:fd Speed:0 Mtu:9001} {Name:ovs-system MacAddress:26:a7:e6:65:0a:91 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:42:17.142210 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.142205 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 18:42:17.142334 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.142290 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:42:17.144862 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.144839 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:42:17.145036 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.144865 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-223.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:42:17.145614 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.145604 2568 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:42:17.145652 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.145617 2568 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:42:17.145652 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.145630 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:42:17.145652 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.145644 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:42:17.147035 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.147024 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 
22 18:42:17.147152 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.147143 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:42:17.149639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.149629 2568 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:42:17.149683 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.149642 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:42:17.149683 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.149654 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:42:17.149683 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.149665 2568 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:42:17.149683 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.149675 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:42:17.150835 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.150823 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:42:17.150883 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.150850 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:42:17.154021 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.153998 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:42:17.155449 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.155431 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:42:17.157563 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157546 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:42:17.157638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157569 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:42:17.157638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157579 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:42:17.157638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157588 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:42:17.157638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157597 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:42:17.157638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157606 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:42:17.157638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157615 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:42:17.157638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157624 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:42:17.157638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157636 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:42:17.157844 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157644 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:42:17.157844 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.157656 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:42:17.157844 ip-10-0-137-223 
kubenswrapper[2568]: I0422 18:42:17.157669 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:42:17.158533 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.158524 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:42:17.158533 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.158533 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:42:17.162166 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.162151 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:42:17.162246 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.162191 2568 server.go:1295] "Started kubelet" Apr 22 18:42:17.162342 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.162304 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:42:17.162393 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.162288 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:42:17.162393 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.162382 2568 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:42:17.163166 ip-10-0-137-223 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:42:17.163814 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.163739 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:42:17.165103 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.165088 2568 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:42:17.165570 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.165551 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-223.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:42:17.165687 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.165591 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:42:17.165846 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.165826 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-223.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:42:17.172085 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.172064 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:42:17.173096 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.172804 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:42:17.173096 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.172965 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fbldp" Apr 22 18:42:17.173769 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.173753 2568 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:42:17.173883 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.173873 2568 
volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:42:17.173972 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.173886 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:42:17.174052 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.172718 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-223.ec2.internal.18a8c1fb16dc8a37 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-223.ec2.internal,UID:ip-10-0-137-223.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-223.ec2.internal,},FirstTimestamp:2026-04-22 18:42:17.162164791 +0000 UTC m=+0.449600972,LastTimestamp:2026-04-22 18:42:17.162164791 +0000 UTC m=+0.449600972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-223.ec2.internal,}" Apr 22 18:42:17.174140 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.174061 2568 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:42:17.174140 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.174090 2568 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:42:17.174586 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.174558 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.174814 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.174777 2568 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:42:17.175119 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.175101 2568 factory.go:55] Registering systemd factory Apr 22 18:42:17.175174 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.175127 2568 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:42:17.175745 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.175320 2568 factory.go:153] Registering CRI-O factory Apr 22 18:42:17.175745 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.175337 2568 factory.go:223] Registration of the crio container factory successfully Apr 22 18:42:17.175745 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.175405 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:42:17.175745 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.175430 2568 factory.go:103] Registering Raw factory Apr 22 18:42:17.175745 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.175447 2568 manager.go:1196] Started watching for new ooms in manager Apr 22 18:42:17.176165 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.175869 2568 manager.go:319] Starting recovery of all containers Apr 22 18:42:17.179393 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.179258 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-223.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:42:17.179393 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.179286 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:42:17.186115 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.186098 2568 manager.go:324] Recovery completed Apr 22 18:42:17.187387 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.187371 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fbldp" Apr 22 18:42:17.187750 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.187733 2568 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 18:42:17.190649 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.190637 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:17.193267 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.193250 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:17.193360 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.193282 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:17.193360 ip-10-0-137-223 kubenswrapper[2568]: I0422 
18:42:17.193296 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:17.193873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.193854 2568 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:42:17.193873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.193869 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:42:17.193960 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.193886 2568 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:42:17.195410 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.195343 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-223.ec2.internal.18a8c1fb18b720b7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-223.ec2.internal,UID:ip-10-0-137-223.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-223.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-223.ec2.internal,},FirstTimestamp:2026-04-22 18:42:17.193267383 +0000 UTC m=+0.480703551,LastTimestamp:2026-04-22 18:42:17.193267383 +0000 UTC m=+0.480703551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-223.ec2.internal,}" Apr 22 18:42:17.196123 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.196110 2568 policy_none.go:49] "None policy: Start" Apr 22 18:42:17.196174 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.196133 2568 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:42:17.196174 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.196144 2568 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:42:17.245278 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.245255 2568 manager.go:341] "Starting Device Plugin manager" Apr 22 18:42:17.245439 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.245299 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:42:17.245439 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.245312 2568 server.go:85] "Starting device plugin registration server" Apr 22 18:42:17.245643 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.245630 2568 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:42:17.245700 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.245646 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:42:17.245770 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.245745 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:42:17.245866 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.245846 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:42:17.245866 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.245866 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:42:17.246370 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.246346 2568 eviction_manager.go:267] "eviction manager: failed to check if we have 
separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 18:42:17.246431 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.246397 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.300283 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.300240 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:42:17.301402 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.301386 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:42:17.301521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.301445 2568 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:42:17.301521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.301472 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:42:17.301521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.301481 2568 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:42:17.301655 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.301537 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:42:17.306472 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.306417 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:17.345825 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.345786 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:17.347011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.346993 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:17.347098 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.347029 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:17.347098 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.347046 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:17.347098 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.347081 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.357070 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.357050 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.357146 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.357074 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-223.ec2.internal\": node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.380868 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.380840 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.401773 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.401745 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal"] Apr 22 18:42:17.401865 ip-10-0-137-223 
kubenswrapper[2568]: I0422 18:42:17.401817 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:17.402788 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.402771 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:17.402878 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.402800 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:17.402878 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.402810 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:17.404064 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404051 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:17.404215 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404199 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.404268 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404229 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:17.404770 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404747 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:17.404868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404768 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:17.404868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404778 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:17.404868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404792 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:17.404868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404794 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:17.404868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.404808 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:17.405948 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.405929 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.406025 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.405959 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:42:17.406675 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.406654 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:42:17.406752 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.406682 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:42:17.406752 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.406695 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:42:17.422088 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.422066 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-223.ec2.internal\" not found" node="ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.425467 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.425450 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-223.ec2.internal\" not found" node="ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.475055 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.475021 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d039df04fe07725636d2beee207afee3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal\" (UID: \"d039df04fe07725636d2beee207afee3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.475055 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.475052 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/10feef451d98ade143f78a7b02301421-config\") pod \"kube-apiserver-proxy-ip-10-0-137-223.ec2.internal\" (UID: \"10feef451d98ade143f78a7b02301421\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.475258 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.475071 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d039df04fe07725636d2beee207afee3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal\" (UID: \"d039df04fe07725636d2beee207afee3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.481353 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.481335 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.575431 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.575364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d039df04fe07725636d2beee207afee3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal\" (UID: \"d039df04fe07725636d2beee207afee3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 
22 18:42:17.575431 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.575396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d039df04fe07725636d2beee207afee3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal\" (UID: \"d039df04fe07725636d2beee207afee3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.575431 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.575414 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/10feef451d98ade143f78a7b02301421-config\") pod \"kube-apiserver-proxy-ip-10-0-137-223.ec2.internal\" (UID: \"10feef451d98ade143f78a7b02301421\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.575616 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.575457 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/10feef451d98ade143f78a7b02301421-config\") pod \"kube-apiserver-proxy-ip-10-0-137-223.ec2.internal\" (UID: \"10feef451d98ade143f78a7b02301421\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.575616 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.575461 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d039df04fe07725636d2beee207afee3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal\" (UID: \"d039df04fe07725636d2beee207afee3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.575616 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.575464 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d039df04fe07725636d2beee207afee3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal\" (UID: \"d039df04fe07725636d2beee207afee3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.582186 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.582168 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.683203 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.683171 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.724376 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.724345 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.727888 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:17.727868 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" Apr 22 18:42:17.783761 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.783720 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.884345 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.884286 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:17.984768 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:17.984742 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:18.024602 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.024576 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:18.079527 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.079488 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:42:18.080190 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.079640 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:42:18.080190 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.079683 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:42:18.085644 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:18.085618 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:18.173055 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.173029 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:42:18.185753 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:18.185723 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:18.185753 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.185735 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:42:18.189752 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.189712 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:37:17 +0000 UTC" deadline="2027-11-29 00:29:29.916249811 +0000 UTC" Apr 22 18:42:18.189806 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.189755 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14045h47m11.726499452s" Apr 22 18:42:18.213691 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.213645 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4dxbq" Apr 22 18:42:18.222531 ip-10-0-137-223 kubenswrapper[2568]: I0422 
18:42:18.222475 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4dxbq" Apr 22 18:42:18.265519 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:18.265469 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10feef451d98ade143f78a7b02301421.slice/crio-20c845ade2e5a2ef32ddebddade279cd695fa7680c5ce7def42bc575326bbc64 WatchSource:0}: Error finding container 20c845ade2e5a2ef32ddebddade279cd695fa7680c5ce7def42bc575326bbc64: Status 404 returned error can't find the container with id 20c845ade2e5a2ef32ddebddade279cd695fa7680c5ce7def42bc575326bbc64 Apr 22 18:42:18.265696 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:18.265679 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd039df04fe07725636d2beee207afee3.slice/crio-66141811917bd92996500dc591dd0c2aaecb927281930a9b10d1197305ec002c WatchSource:0}: Error finding container 66141811917bd92996500dc591dd0c2aaecb927281930a9b10d1197305ec002c: Status 404 returned error can't find the container with id 66141811917bd92996500dc591dd0c2aaecb927281930a9b10d1197305ec002c Apr 22 18:42:18.270092 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.270075 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:42:18.286742 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:18.286718 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:18.304715 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.304668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" event={"ID":"d039df04fe07725636d2beee207afee3","Type":"ContainerStarted","Data":"66141811917bd92996500dc591dd0c2aaecb927281930a9b10d1197305ec002c"} Apr 22 18:42:18.305635 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.305613 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" event={"ID":"10feef451d98ade143f78a7b02301421","Type":"ContainerStarted","Data":"20c845ade2e5a2ef32ddebddade279cd695fa7680c5ce7def42bc575326bbc64"} Apr 22 18:42:18.387215 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:18.387176 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:18.438622 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.438545 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:18.487885 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:18.487854 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-223.ec2.internal\" not found" Apr 22 18:42:18.557521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.557486 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:18.574614 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.574580 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" Apr 22 18:42:18.586557 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.586534 2568 warnings.go:110] "Warning: metadata.name: this is 
used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:42:18.587553 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.587531 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" Apr 22 18:42:18.596317 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:18.596297 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:42:19.150336 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.150303 2568 apiserver.go:52] "Watching apiserver" Apr 22 18:42:19.156977 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.156946 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:42:19.158104 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.158076 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-b2z66","openshift-network-operator/iptables-alerter-7mq2g","kube-system/konnectivity-agent-2sskf","kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b","openshift-image-registry/node-ca-pnwpz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal","openshift-multus/multus-w85bl","openshift-ovn-kubernetes/ovnkube-node-d7r2q","openshift-cluster-node-tuning-operator/tuned-pbvd2","openshift-multus/multus-additional-cni-plugins-bvmpf","openshift-multus/network-metrics-daemon-p7h9z"] Apr 22 18:42:19.160710 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.160677 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:19.160804 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.160756 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:19.161754 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.161731 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.162024 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.161888 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:19.163266 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.163250 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.164108 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.164088 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6725v\"" Apr 22 18:42:19.164199 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.164140 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:42:19.164199 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.164147 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:42:19.164199 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.164182 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zqm56\"" Apr 22 18:42:19.164365 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.164346 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:42:19.164415 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.164404 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:42:19.164485 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.164462 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:42:19.164775 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.164756 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.165431 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.165413 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:42:19.165626 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.165611 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:42:19.165818 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.165797 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:42:19.166181 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.166159 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xnt57\"" Apr 22 18:42:19.166895 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.166878 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:42:19.167029 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.166879 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:42:19.167181 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.167165 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.167274 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.167173 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:42:19.167344 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.167301 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.167619 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.167598 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-75794\"" Apr 22 18:42:19.168600 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.168583 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.170037 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.169984 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:42:19.170131 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170115 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.170292 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170204 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:42:19.170361 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170290 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:42:19.170361 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170116 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170458 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-g422j\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170512 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170730 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170807 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170841 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170847 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.170905 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: 
I0422 18:42:19.170807 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:42:19.171384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.171362 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dqsnv\"" Apr 22 18:42:19.171914 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.171722 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:42:19.171914 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.171813 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:19.171914 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.171833 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-4xfr4\"" Apr 22 18:42:19.171914 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.171876 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:19.172531 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.172495 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:42:19.172912 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.172896 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-nd27q\"" Apr 22 18:42:19.173483 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.173465 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:42:19.175301 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.175285 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:42:19.185028 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185006 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-cni-bin\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.185148 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185039 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-conf-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.185148 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185074 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-log-socket\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.185148 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-os-release\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.185148 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185119 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.185148 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-systemd\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-k8s-cni-cncf-io\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185187 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6hvh\" (UniqueName: \"kubernetes.io/projected/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-kube-api-access-z6hvh\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185202 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-cni-multus\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185215 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-hostroot\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysconfig\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185269 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6dc635c-cd72-46e9-b939-3250ffd02a4c-tmp\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185321 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqh6\" (UniqueName: \"kubernetes.io/projected/d0b97240-e44a-4e95-a872-db48e0018120-kube-api-access-jdqh6\") pod \"iptables-alerter-7mq2g\" (UID: \"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185355 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-etc-selinux\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185381 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtfhc\" (UniqueName: \"kubernetes.io/projected/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-kube-api-access-rtfhc\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185405 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-var-lib-openvswitch\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-env-overrides\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185452 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx49l\" (UniqueName: \"kubernetes.io/projected/5ccbf34b-0127-4661-9440-330712037d5a-kube-api-access-jx49l\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185474 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cnibin\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.185520 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185495 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-device-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-daemon-config\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysctl-conf\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185583 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-lib-modules\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185629 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185674 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-registration-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185700 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-systemd-units\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185725 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-socket-dir-parent\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185747 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-openvswitch\") pod 
\"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185781 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-sys-fs\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185820 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-cni-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185849 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-host\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185874 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185901 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.185939 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-multus-certs\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186003 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.186161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186043 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d0b97240-e44a-4e95-a872-db48e0018120-iptables-alerter-script\") pod \"iptables-alerter-7mq2g\" (UID: 
\"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186077 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-socket-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186103 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-netns\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186130 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-run-netns\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186154 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-run-ovn-kubernetes\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186204 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ccbf34b-0127-4661-9440-330712037d5a-ovn-node-metrics-cert\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186246 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-modprobe-d\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186273 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186298 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 
18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186322 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-host\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186345 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-serviceca\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-cnibin\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-run\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186416 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-sys\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186438 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b77px\" (UniqueName: \"kubernetes.io/projected/28918ed5-c9e8-4d81-9874-821a4393de68-kube-api-access-b77px\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186466 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-etc-openvswitch\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186515 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-ovn\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.186954 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186539 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-tuned\") 
pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186567 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186591 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxft7\" (UniqueName: \"kubernetes.io/projected/42b4b135-c4b0-4460-84ed-684f25a4436d-kube-api-access-cxft7\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186620 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-kubelet\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186639 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-kubelet\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186656 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-cni-bin\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186675 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-ovnkube-config\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186690 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-system-cni-dir\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186712 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b31604e3-953f-4f25-9eb6-b50d500412bb-konnectivity-ca\") pod \"konnectivity-agent-2sskf\" (UID: \"b31604e3-953f-4f25-9eb6-b50d500412bb\") " 
pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186734 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-cni-binary-copy\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186755 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-etc-kubernetes\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186778 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-ovnkube-script-lib\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186815 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-kubernetes\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysctl-d\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186902 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0b97240-e44a-4e95-a872-db48e0018120-host-slash\") pod \"iptables-alerter-7mq2g\" (UID: \"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186926 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b31604e3-953f-4f25-9eb6-b50d500412bb-agent-certs\") pod \"konnectivity-agent-2sskf\" (UID: \"b31604e3-953f-4f25-9eb6-b50d500412bb\") " pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:19.187765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186966 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-system-cni-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.188521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.186990 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-slash\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.188521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.187033 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-systemd\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.188521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.187057 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tctv8\" (UniqueName: \"kubernetes.io/projected/7acfec53-d2af-4a00-a80f-87b1b1b045b1-kube-api-access-tctv8\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.188521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.187078 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-os-release\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.188521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.187095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-node-log\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.188521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.187132 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-cni-netd\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.188521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.187149 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-var-lib-kubelet\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.188521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.187163 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzpkq\" (UniqueName: \"kubernetes.io/projected/b6dc635c-cd72-46e9-b939-3250ffd02a4c-kube-api-access-zzpkq\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.223906 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.223863 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:37:18 +0000 UTC" deadline="2028-01-08 22:35:34.616907688 +0000 UTC" Apr 22 18:42:19.223906 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.223896 
2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15027h53m15.393015081s" Apr 22 18:42:19.287726 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287697 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-serviceca\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.287726 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287728 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-cnibin\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287748 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-run\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287763 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-sys\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287782 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b77px\" (UniqueName: \"kubernetes.io/projected/28918ed5-c9e8-4d81-9874-821a4393de68-kube-api-access-b77px\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-run\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287854 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-cnibin\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-etc-openvswitch\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287902 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-etc-openvswitch\") pod \"ovnkube-node-d7r2q\" (UID: 
\"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287913 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-ovn\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287941 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-tuned\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.287955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287933 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-sys\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287963 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.287990 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxft7\" (UniqueName: \"kubernetes.io/projected/42b4b135-c4b0-4460-84ed-684f25a4436d-kube-api-access-cxft7\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288000 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-ovn\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288016 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-kubelet\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-kubelet\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-cni-bin\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288113 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-kubelet\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-ovnkube-config\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288146 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288163 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-system-cni-dir\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-kubelet\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b31604e3-953f-4f25-9eb6-b50d500412bb-konnectivity-ca\") pod \"konnectivity-agent-2sskf\" (UID: \"b31604e3-953f-4f25-9eb6-b50d500412bb\") " pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288224 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-system-cni-dir\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288239 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-cni-binary-copy\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288261 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-serviceca\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-etc-kubernetes\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.288397 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288285 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-cni-bin\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288311 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-ovnkube-script-lib\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288277 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288347 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-kubernetes\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288371 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysctl-d\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288417 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0b97240-e44a-4e95-a872-db48e0018120-host-slash\") pod \"iptables-alerter-7mq2g\" (UID: \"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b31604e3-953f-4f25-9eb6-b50d500412bb-agent-certs\") pod \"konnectivity-agent-2sskf\" (UID: \"b31604e3-953f-4f25-9eb6-b50d500412bb\") " pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288492 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-system-cni-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288534 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-slash\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-systemd\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tctv8\" (UniqueName: \"kubernetes.io/projected/7acfec53-d2af-4a00-a80f-87b1b1b045b1-kube-api-access-tctv8\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288609 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-os-release\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288638 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-node-log\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288664 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-cni-netd\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288690 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-var-lib-kubelet\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-systemd\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288761 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b31604e3-953f-4f25-9eb6-b50d500412bb-konnectivity-ca\") pod \"konnectivity-agent-2sskf\" (UID: \"b31604e3-953f-4f25-9eb6-b50d500412bb\") " pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288781 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-system-cni-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288780 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysctl-d\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-kubernetes\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288830 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-slash\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-node-log\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288899 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-os-release\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288368 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-etc-kubernetes\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288959 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0b97240-e44a-4e95-a872-db48e0018120-host-slash\") pod \"iptables-alerter-7mq2g\" (UID: \"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.288663 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-ovnkube-config\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289097 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-var-lib-kubelet\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289154 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzpkq\" (UniqueName: \"kubernetes.io/projected/b6dc635c-cd72-46e9-b939-3250ffd02a4c-kube-api-access-zzpkq\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289184 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-cni-bin\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289211 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-conf-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289275 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-cni-bin\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289285 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-conf-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-log-socket\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289314 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-cni-netd\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289351 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-os-release\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289381 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.289969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289398 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-log-socket\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289408 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-systemd\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289434 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-k8s-cni-cncf-io\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289460 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6hvh\" (UniqueName: \"kubernetes.io/projected/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-kube-api-access-z6hvh\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-cni-binary-copy\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289484 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-cni-multus\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289533 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-var-lib-cni-multus\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289552 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-hostroot\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289574 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-systemd\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289580 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-k8s-cni-cncf-io\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-hostroot\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289637 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysconfig\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289578 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysconfig\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289677 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6dc635c-cd72-46e9-b939-3250ffd02a4c-tmp\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdqh6\" (UniqueName: \"kubernetes.io/projected/d0b97240-e44a-4e95-a872-db48e0018120-kube-api-access-jdqh6\") pod \"iptables-alerter-7mq2g\" (UID: \"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289678 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-os-release\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289728 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-etc-selinux\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289755 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtfhc\" (UniqueName: \"kubernetes.io/projected/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-kube-api-access-rtfhc\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.290821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289778 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-var-lib-openvswitch\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289801 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-env-overrides\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx49l\" (UniqueName: \"kubernetes.io/projected/5ccbf34b-0127-4661-9440-330712037d5a-kube-api-access-jx49l\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289851 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cnibin\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289877 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-device-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289901 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-daemon-config\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysctl-conf\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289934 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-var-lib-openvswitch\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289951 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-lib-modules\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289976 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289976 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-ovnkube-script-lib\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.289991 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-etc-selinux\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290029 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-device-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290061 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-registration-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290085 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-systemd-units\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-lib-modules\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290109 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-socket-dir-parent\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.291660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290134 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-openvswitch\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290158 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-sys-fs\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290183 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-cni-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290197 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-sysctl-conf\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290207 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-host\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290233 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290239 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-systemd-units\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290267 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290297 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-multus-certs\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290327 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290354 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d0b97240-e44a-4e95-a872-db48e0018120-iptables-alerter-script\") pod \"iptables-alerter-7mq2g\" (UID: \"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290362 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-registration-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-socket-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290405 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-netns\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290430 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-run-netns\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290432 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cnibin\") pod \"multus-additional-cni-plugins-bvmpf\" 
(UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290459 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-run-ovn-kubernetes\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290485 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ccbf34b-0127-4661-9440-330712037d5a-ovn-node-metrics-cert\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290537 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-modprobe-d\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-run-openvswitch\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290585 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-daemon-config\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290614 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290647 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-host\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-multus-certs\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290696 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-sys-fs\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290701 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ccbf34b-0127-4661-9440-330712037d5a-env-overrides\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290749 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-cni-dir\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290757 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290786 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-run-netns\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-host-run-netns\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290848 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/28918ed5-c9e8-4d81-9874-821a4393de68-socket-dir\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290858 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290885 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-host\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290951 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-host\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.292736 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.290989 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.291028 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-multus-socket-dir-parent\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.291109 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:42:19.791077303 +0000 UTC m=+3.078513479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.291151 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.291235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-modprobe-d\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.290989 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ccbf34b-0127-4661-9440-330712037d5a-host-run-ovn-kubernetes\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.291290 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d0b97240-e44a-4e95-a872-db48e0018120-iptables-alerter-script\") pod \"iptables-alerter-7mq2g\" (UID: 
\"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.291734 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.292062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7acfec53-d2af-4a00-a80f-87b1b1b045b1-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.292971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b6dc635c-cd72-46e9-b939-3250ffd02a4c-tmp\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.293489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.293186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b31604e3-953f-4f25-9eb6-b50d500412bb-agent-certs\") pod \"konnectivity-agent-2sskf\" (UID: \"b31604e3-953f-4f25-9eb6-b50d500412bb\") " pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:19.294003 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.293542 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b6dc635c-cd72-46e9-b939-3250ffd02a4c-etc-tuned\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.294003 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.293683 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ccbf34b-0127-4661-9440-330712037d5a-ovn-node-metrics-cert\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.300307 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.300002 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxft7\" (UniqueName: \"kubernetes.io/projected/42b4b135-c4b0-4460-84ed-684f25a4436d-kube-api-access-cxft7\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:19.300445 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.300420 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b77px\" (UniqueName: \"kubernetes.io/projected/28918ed5-c9e8-4d81-9874-821a4393de68-kube-api-access-b77px\") pod \"aws-ebs-csi-driver-node-zvx5b\" (UID: \"28918ed5-c9e8-4d81-9874-821a4393de68\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.305914 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.305886 2568 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:19.305914 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.305916 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:19.306240 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.305929 2568 projected.go:194] Error preparing data for projected volume kube-api-access-97rlf for pod openshift-network-diagnostics/network-check-target-b2z66: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:19.306240 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.306007 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf podName:ed2bee6e-74d3-4d8c-be17-c9c35096ad8b nodeName:}" failed. No retries permitted until 2026-04-22 18:42:19.805988715 +0000 UTC m=+3.093424891 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-97rlf" (UniqueName: "kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf") pod "network-check-target-b2z66" (UID: "ed2bee6e-74d3-4d8c-be17-c9c35096ad8b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:19.308541 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.308515 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtfhc\" (UniqueName: \"kubernetes.io/projected/fa08d78c-48ce-42a5-85a9-3f4ae1d8a468-kube-api-access-rtfhc\") pod \"node-ca-pnwpz\" (UID: \"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468\") " pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.308937 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.308910 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx49l\" (UniqueName: \"kubernetes.io/projected/5ccbf34b-0127-4661-9440-330712037d5a-kube-api-access-jx49l\") pod \"ovnkube-node-d7r2q\" (UID: \"5ccbf34b-0127-4661-9440-330712037d5a\") " pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.309386 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.309361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tctv8\" (UniqueName: \"kubernetes.io/projected/7acfec53-d2af-4a00-a80f-87b1b1b045b1-kube-api-access-tctv8\") pod \"multus-additional-cni-plugins-bvmpf\" (UID: \"7acfec53-d2af-4a00-a80f-87b1b1b045b1\") " pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.309532 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.309494 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzpkq\" (UniqueName: \"kubernetes.io/projected/b6dc635c-cd72-46e9-b939-3250ffd02a4c-kube-api-access-zzpkq\") pod \"tuned-pbvd2\" (UID: \"b6dc635c-cd72-46e9-b939-3250ffd02a4c\") " pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.309951 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.309928 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqh6\" (UniqueName: \"kubernetes.io/projected/d0b97240-e44a-4e95-a872-db48e0018120-kube-api-access-jdqh6\") pod \"iptables-alerter-7mq2g\" 
(UID: \"d0b97240-e44a-4e95-a872-db48e0018120\") " pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.310003 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.309960 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6hvh\" (UniqueName: \"kubernetes.io/projected/72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc-kube-api-access-z6hvh\") pod \"multus-w85bl\" (UID: \"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc\") " pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.466634 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.466555 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:19.474972 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.474946 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7mq2g" Apr 22 18:42:19.482753 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.482731 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:19.491433 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.491405 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" Apr 22 18:42:19.496097 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.496078 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pnwpz" Apr 22 18:42:19.503739 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.503718 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-w85bl" Apr 22 18:42:19.509623 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.509605 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:19.515131 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.515113 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" Apr 22 18:42:19.520659 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.520637 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" Apr 22 18:42:19.793646 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.793580 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:19.793787 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.793695 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:19.793787 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.793757 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:42:20.793738029 +0000 UTC m=+4.081174199 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:19.830110 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:19.829943 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28918ed5_c9e8_4d81_9874_821a4393de68.slice/crio-f2f3716831121f30e03db55266540f43ca4e9fc976adc5cc5638782f6a01a6fc WatchSource:0}: Error finding container f2f3716831121f30e03db55266540f43ca4e9fc976adc5cc5638782f6a01a6fc: Status 404 returned error can't find the container with id f2f3716831121f30e03db55266540f43ca4e9fc976adc5cc5638782f6a01a6fc Apr 22 18:42:19.830659 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:19.830634 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dc635c_cd72_46e9_b939_3250ffd02a4c.slice/crio-9cbf13eef449b5447081412f47ed187c44d83fcc0ab40c3f9ae99d0b9d5e5c93 WatchSource:0}: Error finding container 9cbf13eef449b5447081412f47ed187c44d83fcc0ab40c3f9ae99d0b9d5e5c93: Status 404 returned error can't find the container with id 9cbf13eef449b5447081412f47ed187c44d83fcc0ab40c3f9ae99d0b9d5e5c93 Apr 22 18:42:19.831595 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:19.831571 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72d04d6b_6ce2_4128_a4ed_8adc4a19b7bc.slice/crio-4bb4a97172c8cf401ab94da90b825dd20f55afdb1d27cfa831b58f3733232c9e WatchSource:0}: Error finding container 4bb4a97172c8cf401ab94da90b825dd20f55afdb1d27cfa831b58f3733232c9e: Status 404 returned error can't find the container with id 4bb4a97172c8cf401ab94da90b825dd20f55afdb1d27cfa831b58f3733232c9e Apr 22 18:42:19.835700 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:19.835290 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7acfec53_d2af_4a00_a80f_87b1b1b045b1.slice/crio-cf7ee20116ce182865cf58fef0107e1a11c541a442e2605d646e352d15029164 WatchSource:0}: Error finding container cf7ee20116ce182865cf58fef0107e1a11c541a442e2605d646e352d15029164: Status 404 returned error can't find the container with id cf7ee20116ce182865cf58fef0107e1a11c541a442e2605d646e352d15029164 Apr 22 18:42:19.836470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:19.836439 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0b97240_e44a_4e95_a872_db48e0018120.slice/crio-12e5a396ade465b0c8f9234413ae828b1673d6a95742c185fd5fd997d537aed6 WatchSource:0}: Error finding container 12e5a396ade465b0c8f9234413ae828b1673d6a95742c185fd5fd997d537aed6: Status 404 returned error can't find the container with id 12e5a396ade465b0c8f9234413ae828b1673d6a95742c185fd5fd997d537aed6 Apr 22 18:42:19.838371 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:19.838338 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ccbf34b_0127_4661_9440_330712037d5a.slice/crio-a40c370e9d2fe4456ed7abe2339e60b8ed94b6dec8fac9648a82c29808566b5b WatchSource:0}: Error finding container a40c370e9d2fe4456ed7abe2339e60b8ed94b6dec8fac9648a82c29808566b5b: Status 404 returned error can't 
find the container with id a40c370e9d2fe4456ed7abe2339e60b8ed94b6dec8fac9648a82c29808566b5b Apr 22 18:42:19.839336 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:19.839312 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa08d78c_48ce_42a5_85a9_3f4ae1d8a468.slice/crio-e9bad1723d372809dfcadfe94292a73598bf2b9f0b335b37b044d299e10dd874 WatchSource:0}: Error finding container e9bad1723d372809dfcadfe94292a73598bf2b9f0b335b37b044d299e10dd874: Status 404 returned error can't find the container with id e9bad1723d372809dfcadfe94292a73598bf2b9f0b335b37b044d299e10dd874 Apr 22 18:42:19.840798 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:19.840525 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31604e3_953f_4f25_9eb6_b50d500412bb.slice/crio-391ddb2be2a2e667b1e42e6d7f149d866af3132ce0e4a0ea9aa3bc0ac8458a7d WatchSource:0}: Error finding container 391ddb2be2a2e667b1e42e6d7f149d866af3132ce0e4a0ea9aa3bc0ac8458a7d: Status 404 returned error can't find the container with id 391ddb2be2a2e667b1e42e6d7f149d866af3132ce0e4a0ea9aa3bc0ac8458a7d Apr 22 18:42:19.894768 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:19.894738 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:19.894872 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.894857 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:19.894914 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.894876 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:19.894914 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.894886 2568 projected.go:194] Error preparing data for projected volume kube-api-access-97rlf for pod openshift-network-diagnostics/network-check-target-b2z66: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:19.894986 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:19.894929 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf podName:ed2bee6e-74d3-4d8c-be17-c9c35096ad8b nodeName:}" failed. No retries permitted until 2026-04-22 18:42:20.894913757 +0000 UTC m=+4.182349927 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-97rlf" (UniqueName: "kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf") pod "network-check-target-b2z66" (UID: "ed2bee6e-74d3-4d8c-be17-c9c35096ad8b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:20.225753 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.225611 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:37:18 +0000 UTC" deadline="2028-01-19 21:49:08.628345378 +0000 UTC" Apr 22 18:42:20.225753 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.225649 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15291h6m48.402699587s" Apr 22 18:42:20.327423 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.326789 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" event={"ID":"10feef451d98ade143f78a7b02301421","Type":"ContainerStarted","Data":"2e20ca507928fd5ade1143e099dbcd8e96e5fdaa2d81719136c49864adcafb1d"} Apr 22 18:42:20.329996 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.329966 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2sskf" event={"ID":"b31604e3-953f-4f25-9eb6-b50d500412bb","Type":"ContainerStarted","Data":"391ddb2be2a2e667b1e42e6d7f149d866af3132ce0e4a0ea9aa3bc0ac8458a7d"} Apr 22 18:42:20.331669 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.331628 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pnwpz" event={"ID":"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468","Type":"ContainerStarted","Data":"e9bad1723d372809dfcadfe94292a73598bf2b9f0b335b37b044d299e10dd874"} Apr 22 18:42:20.337015 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.336990 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7mq2g" event={"ID":"d0b97240-e44a-4e95-a872-db48e0018120","Type":"ContainerStarted","Data":"12e5a396ade465b0c8f9234413ae828b1673d6a95742c185fd5fd997d537aed6"} Apr 22 18:42:20.343290 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.342817 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-223.ec2.internal" podStartSLOduration=2.342802696 podStartE2EDuration="2.342802696s" podCreationTimestamp="2026-04-22 18:42:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:42:20.342195614 +0000 UTC m=+3.629631800" watchObservedRunningTime="2026-04-22 18:42:20.342802696 +0000 UTC m=+3.630238875" Apr 22 18:42:20.346745 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.346585 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:42:20.347647 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.347622 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" event={"ID":"b6dc635c-cd72-46e9-b939-3250ffd02a4c","Type":"ContainerStarted","Data":"9cbf13eef449b5447081412f47ed187c44d83fcc0ab40c3f9ae99d0b9d5e5c93"} Apr 22 18:42:20.356294 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.356244 2568 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerStarted","Data":"cf7ee20116ce182865cf58fef0107e1a11c541a442e2605d646e352d15029164"} Apr 22 18:42:20.360802 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.360727 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" event={"ID":"28918ed5-c9e8-4d81-9874-821a4393de68","Type":"ContainerStarted","Data":"f2f3716831121f30e03db55266540f43ca4e9fc976adc5cc5638782f6a01a6fc"} Apr 22 18:42:20.371012 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.370975 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"a40c370e9d2fe4456ed7abe2339e60b8ed94b6dec8fac9648a82c29808566b5b"} Apr 22 18:42:20.380095 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.380071 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85bl" event={"ID":"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc","Type":"ContainerStarted","Data":"4bb4a97172c8cf401ab94da90b825dd20f55afdb1d27cfa831b58f3733232c9e"} Apr 22 18:42:20.811336 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.811285 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:20.811517 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:20.811487 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:20.811595 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:20.811583 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:42:22.811562108 +0000 UTC m=+6.098998267 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:20.912528 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:20.912475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:20.912714 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:20.912659 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:20.912714 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:20.912679 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:20.912714 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:20.912691 2568 projected.go:194] Error preparing data for projected volume kube-api-access-97rlf for pod openshift-network-diagnostics/network-check-target-b2z66: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:20.912886 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:20.912761 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf podName:ed2bee6e-74d3-4d8c-be17-c9c35096ad8b nodeName:}" failed. No retries permitted until 2026-04-22 18:42:22.912742067 +0000 UTC m=+6.200178227 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-97rlf" (UniqueName: "kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf") pod "network-check-target-b2z66" (UID: "ed2bee6e-74d3-4d8c-be17-c9c35096ad8b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:21.304655 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:21.304599 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:21.305115 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:21.304727 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:21.305227 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:21.305167 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:21.305340 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:21.305266 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:21.402328 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:21.402289 2568 generic.go:358] "Generic (PLEG): container finished" podID="d039df04fe07725636d2beee207afee3" containerID="822dc0dd1de09e0dd981fb5243ea09118d68667f91b883e7ed3bf47f1654905d" exitCode=0 Apr 22 18:42:21.402588 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:21.402472 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" event={"ID":"d039df04fe07725636d2beee207afee3","Type":"ContainerDied","Data":"822dc0dd1de09e0dd981fb5243ea09118d68667f91b883e7ed3bf47f1654905d"} Apr 22 18:42:22.408551 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:22.408285 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" event={"ID":"d039df04fe07725636d2beee207afee3","Type":"ContainerStarted","Data":"c5dd7577b0aed1e43753a1519f565f4c2f7d6232492d87713e829b6e56523753"} Apr 22 18:42:22.830720 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:22.830683 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:22.830890 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:22.830836 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:22.830965 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:22.830903 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:42:26.830884748 +0000 UTC m=+10.118320908 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:22.931044 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:22.930994 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:22.931214 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:22.931184 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:22.931214 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:22.931202 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:22.931214 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:22.931216 2568 projected.go:194] Error preparing data for projected volume kube-api-access-97rlf for pod openshift-network-diagnostics/network-check-target-b2z66: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:22.931405 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:22.931274 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf podName:ed2bee6e-74d3-4d8c-be17-c9c35096ad8b nodeName:}" failed. No retries permitted until 2026-04-22 18:42:26.931257943 +0000 UTC m=+10.218694112 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-97rlf" (UniqueName: "kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf") pod "network-check-target-b2z66" (UID: "ed2bee6e-74d3-4d8c-be17-c9c35096ad8b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:23.304145 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:23.303624 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:23.304145 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:23.303752 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:23.304680 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:23.304491 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:23.304680 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:23.304623 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:25.302468 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:25.302361 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:25.302952 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:25.302593 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:25.302952 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:25.302394 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:25.303073 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:25.303026 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:26.863043 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:26.862776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:26.863043 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:26.862935 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:26.863043 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:26.863004 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:42:34.86298447 +0000 UTC m=+18.150420627 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:26.963429 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:26.963341 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:26.963604 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:26.963546 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:26.963604 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:26.963573 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:26.963604 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:26.963589 2568 projected.go:194] Error preparing data for projected volume kube-api-access-97rlf for pod openshift-network-diagnostics/network-check-target-b2z66: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:26.963811 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:26.963658 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf podName:ed2bee6e-74d3-4d8c-be17-c9c35096ad8b nodeName:}" failed. No retries permitted until 2026-04-22 18:42:34.96363791 +0000 UTC m=+18.251074070 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-97rlf" (UniqueName: "kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf") pod "network-check-target-b2z66" (UID: "ed2bee6e-74d3-4d8c-be17-c9c35096ad8b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:27.303462 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:27.303372 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:27.303633 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:27.303494 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:27.303895 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:27.303731 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:27.303895 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:27.303854 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:29.301827 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.301794 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:29.302268 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.301794 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:29.302268 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:29.301944 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:29.302268 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:29.301991 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:29.556645 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.556537 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-223.ec2.internal" podStartSLOduration=11.556521578 podStartE2EDuration="11.556521578s" podCreationTimestamp="2026-04-22 18:42:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:42:22.424577558 +0000 UTC m=+5.712013737" watchObservedRunningTime="2026-04-22 18:42:29.556521578 +0000 UTC m=+12.843957755" Apr 22 18:42:29.557145 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.557124 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hn296"] Apr 22 18:42:29.612695 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.612664 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.617129 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.617102 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:42:29.617359 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.617340 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-hztd2\"" Apr 22 18:42:29.617877 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.617857 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:42:29.688317 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.688287 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mds2\" (UniqueName: \"kubernetes.io/projected/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-kube-api-access-4mds2\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.688464 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.688329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-tmp-dir\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.688464 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.688357 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-hosts-file\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.789048 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.789017 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-hosts-file\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.789205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.789086 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mds2\" (UniqueName: \"kubernetes.io/projected/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-kube-api-access-4mds2\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.789205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.789125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-tmp-dir\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.789205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.789151 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-hosts-file\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.789486 ip-10-0-137-223 kubenswrapper[2568]: I0422 
18:42:29.789470 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-tmp-dir\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.799704 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.799680 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mds2\" (UniqueName: \"kubernetes.io/projected/7b2ce3e8-c29b-4c51-bc68-022864d2d2fa-kube-api-access-4mds2\") pod \"node-resolver-hn296\" (UID: \"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa\") " pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:29.922811 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:29.922728 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hn296" Apr 22 18:42:31.302209 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:31.302177 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:31.302693 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:31.302224 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:31.302693 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:31.302314 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:31.302693 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:31.302442 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:33.302590 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:33.302552 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:33.303029 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:33.302557 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:33.303029 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:33.302674 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:33.303029 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:33.302775 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:34.926484 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:34.926432 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:34.926993 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:34.926601 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:34.926993 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:34.926676 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:42:50.926655802 +0000 UTC m=+34.214091958 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:35.027652 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:35.027610 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:35.027832 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:35.027785 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:35.027832 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:35.027810 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:35.027832 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:35.027822 2568 projected.go:194] Error preparing data for projected volume kube-api-access-97rlf for pod openshift-network-diagnostics/network-check-target-b2z66: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:35.027957 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:35.027875 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf podName:ed2bee6e-74d3-4d8c-be17-c9c35096ad8b nodeName:}" failed. 
No retries permitted until 2026-04-22 18:42:51.027861218 +0000 UTC m=+34.315297374 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-97rlf" (UniqueName: "kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf") pod "network-check-target-b2z66" (UID: "ed2bee6e-74d3-4d8c-be17-c9c35096ad8b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:35.302626 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:35.302596 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:35.302814 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:35.302633 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:35.302814 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:35.302726 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:35.302925 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:35.302858 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:36.442274 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:36.442244 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b2ce3e8_c29b_4c51_bc68_022864d2d2fa.slice/crio-295bf7ee03cd1e072eca9ea2dbfea0de2f02f83761ec83faa910946c7c1cefe5 WatchSource:0}: Error finding container 295bf7ee03cd1e072eca9ea2dbfea0de2f02f83761ec83faa910946c7c1cefe5: Status 404 returned error can't find the container with id 295bf7ee03cd1e072eca9ea2dbfea0de2f02f83761ec83faa910946c7c1cefe5 Apr 22 18:42:37.302948 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.302644 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:37.303126 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.302727 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:37.303126 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:37.303045 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:37.303235 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:37.303144 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:37.433990 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.433957 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-2sskf" event={"ID":"b31604e3-953f-4f25-9eb6-b50d500412bb","Type":"ContainerStarted","Data":"23b2a9093987b02951d567b8d94d30cfecca2a5bbc5224a1119cfbdbf7185660"} Apr 22 18:42:37.435550 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.435523 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pnwpz" event={"ID":"fa08d78c-48ce-42a5-85a9-3f4ae1d8a468","Type":"ContainerStarted","Data":"c41f833831cab50d692b310c6cab1c96fb1009d99303d45d4d307151ed96683b"} Apr 22 18:42:37.436873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.436848 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" event={"ID":"b6dc635c-cd72-46e9-b939-3250ffd02a4c","Type":"ContainerStarted","Data":"9b9521703043a4d622c7c97d0a2f915a9299ef5187da9184a4b4fef617b7175e"} Apr 22 18:42:37.438184 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.438159 2568 generic.go:358] "Generic (PLEG): container finished" podID="7acfec53-d2af-4a00-a80f-87b1b1b045b1" containerID="de8c73633000282dda6386c4b7b019b3349710d430b2441b25353fee513c70d3" exitCode=0 Apr 22 18:42:37.438262 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.438231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerDied","Data":"de8c73633000282dda6386c4b7b019b3349710d430b2441b25353fee513c70d3"} Apr 22 18:42:37.439834 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.439813 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" event={"ID":"28918ed5-c9e8-4d81-9874-821a4393de68","Type":"ContainerStarted","Data":"cc6269ad00d61bd9181824852137738cc834b7547d0bb7f77cb09f6a335e4431"} Apr 22 18:42:37.443011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.442806 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"43fdbba25948eb61492c82a715165b9f6864e553a2e23a6cc6764950ec846ab0"} Apr 22 18:42:37.443011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.442836 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"aec3a0e867d92298d3b27853262259194a3af376bd1a661c217723bd07769192"} Apr 22 18:42:37.443011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.442850 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" 
event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"fc9b7dd64a727ddeb6a75efcf1ef484b4bf8c49f77e118ce647ab5573e1a07be"} Apr 22 18:42:37.443011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.442863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"63615eb32e92e85c8a6a1782a58f020674e0bee22659bbc0211fd1d3572cba02"} Apr 22 18:42:37.443011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.442875 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"d3ea282b01f91d01f97e7a9090322aa8852f58ae2f16a47dabff78d917871798"} Apr 22 18:42:37.443011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.442890 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"e307f240c480b02df7549515e7c9f5aee06b68e9bba4f6f6ded3c3bf70371384"} Apr 22 18:42:37.444466 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.444215 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-w85bl" event={"ID":"72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc","Type":"ContainerStarted","Data":"7588561585cd6bf863601a09c3c4aef55f4264f44e2b89571ffa79df576fd563"} Apr 22 18:42:37.445953 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.445930 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hn296" event={"ID":"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa","Type":"ContainerStarted","Data":"a29434c229e6a04fd871713c4b298c745428286f88f712ace3bc36572e30856b"} Apr 22 18:42:37.446047 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.445965 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hn296" event={"ID":"7b2ce3e8-c29b-4c51-bc68-022864d2d2fa","Type":"ContainerStarted","Data":"295bf7ee03cd1e072eca9ea2dbfea0de2f02f83761ec83faa910946c7c1cefe5"} Apr 22 18:42:37.480545 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.480481 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-2sskf" podStartSLOduration=3.893523572 podStartE2EDuration="20.480463972s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.842356693 +0000 UTC m=+3.129792848" lastFinishedPulling="2026-04-22 18:42:36.429297088 +0000 UTC m=+19.716733248" observedRunningTime="2026-04-22 18:42:37.456213682 +0000 UTC m=+20.743649860" watchObservedRunningTime="2026-04-22 18:42:37.480463972 +0000 UTC m=+20.767900150" Apr 22 18:42:37.529268 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.529088 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:42:37.532749 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.532712 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pbvd2" podStartSLOduration=3.935653619 podStartE2EDuration="20.53269935s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.832203154 +0000 UTC m=+3.119639310" lastFinishedPulling="2026-04-22 18:42:36.429248874 +0000 UTC m=+19.716685041" observedRunningTime="2026-04-22 
18:42:37.481705319 +0000 UTC m=+20.769141509" watchObservedRunningTime="2026-04-22 18:42:37.53269935 +0000 UTC m=+20.820135528" Apr 22 18:42:37.622275 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.622190 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hn296" podStartSLOduration=8.622176062 podStartE2EDuration="8.622176062s" podCreationTimestamp="2026-04-22 18:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:42:37.570035224 +0000 UTC m=+20.857471413" watchObservedRunningTime="2026-04-22 18:42:37.622176062 +0000 UTC m=+20.909612239" Apr 22 18:42:37.622415 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.622397 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-w85bl" podStartSLOduration=3.842289483 podStartE2EDuration="20.622391696s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.835574971 +0000 UTC m=+3.123011141" lastFinishedPulling="2026-04-22 18:42:36.615677194 +0000 UTC m=+19.903113354" observedRunningTime="2026-04-22 18:42:37.622104553 +0000 UTC m=+20.909540752" watchObservedRunningTime="2026-04-22 18:42:37.622391696 +0000 UTC m=+20.909827873" Apr 22 18:42:37.648532 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:37.648470 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pnwpz" podStartSLOduration=4.06140723 podStartE2EDuration="20.648457083s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.842187124 +0000 UTC m=+3.129623280" lastFinishedPulling="2026-04-22 18:42:36.429236961 +0000 UTC m=+19.716673133" observedRunningTime="2026-04-22 18:42:37.648247894 +0000 UTC m=+20.935684071" watchObservedRunningTime="2026-04-22 18:42:37.648457083 +0000 UTC m=+20.935893261" Apr 22 18:42:38.260515 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:38.259569 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:42:37.529265329Z","UUID":"23e185aa-10b3-4103-b67f-7830fb656f10","Handler":null,"Name":"","Endpoint":""} Apr 22 18:42:38.262195 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:38.262171 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:42:38.262296 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:38.262204 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:42:38.449180 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:38.449149 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7mq2g" event={"ID":"d0b97240-e44a-4e95-a872-db48e0018120","Type":"ContainerStarted","Data":"b5293b629ce23e20d64d35e12cf062d3f3d8f3dca9ec068bd68be0cd78811df6"} Apr 22 18:42:38.451464 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:38.451417 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" event={"ID":"28918ed5-c9e8-4d81-9874-821a4393de68","Type":"ContainerStarted","Data":"bda57db0e17dc0bec4482fc0d3970ad2c225a3da48a78200ddfbab1fc7a818f1"} Apr 22 
18:42:38.468519 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:38.468470 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7mq2g" podStartSLOduration=4.878548584 podStartE2EDuration="21.468457119s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.839327343 +0000 UTC m=+3.126763506" lastFinishedPulling="2026-04-22 18:42:36.429235883 +0000 UTC m=+19.716672041" observedRunningTime="2026-04-22 18:42:38.467988495 +0000 UTC m=+21.755424673" watchObservedRunningTime="2026-04-22 18:42:38.468457119 +0000 UTC m=+21.755893297" Apr 22 18:42:38.483348 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:38.483239 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:38.483976 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:38.483957 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:39.302731 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:39.302698 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:39.302979 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:39.302861 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:39.302979 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:39.302892 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:39.303124 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:39.303046 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:39.455481 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:39.455442 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" event={"ID":"28918ed5-c9e8-4d81-9874-821a4393de68","Type":"ContainerStarted","Data":"cf3f183c734df5eda1aa120165dc0cd61af376c5235f82cf53bd854c420e342d"} Apr 22 18:42:39.458600 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:39.458567 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"f86ce1de3d415b2930afe14db4006686a491c42aeb832ce5a0f9c8191ea29702"} Apr 22 18:42:39.479000 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:39.478947 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zvx5b" podStartSLOduration=3.950769327 podStartE2EDuration="22.478933631s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.831992565 +0000 UTC m=+3.119428729" lastFinishedPulling="2026-04-22 18:42:38.360156874 +0000 UTC m=+21.647593033" observedRunningTime="2026-04-22 18:42:39.478631788 +0000 UTC m=+22.766067967" watchObservedRunningTime="2026-04-22 18:42:39.478933631 +0000 UTC m=+22.766369808" Apr 22 18:42:40.460443 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:40.460410 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:42:41.301677 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:41.301643 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:41.301846 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:41.301762 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:41.301846 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:41.301814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:41.301962 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:41.301927 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:41.465936 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:41.465649 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" event={"ID":"5ccbf34b-0127-4661-9440-330712037d5a","Type":"ContainerStarted","Data":"67b4d262afd58f513f6097f59b600d83840d956ab78dcd574d378487b63eb8a4"} Apr 22 18:42:41.467052 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:41.465990 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:41.481487 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:41.481382 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:41.506123 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:41.505767 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" podStartSLOduration=7.851361237 podStartE2EDuration="24.505745974s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.840671287 +0000 UTC m=+3.128107458" lastFinishedPulling="2026-04-22 18:42:36.495056035 +0000 UTC m=+19.782492195" observedRunningTime="2026-04-22 18:42:41.504992726 +0000 UTC m=+24.792428945" watchObservedRunningTime="2026-04-22 18:42:41.505745974 +0000 UTC m=+24.793182157" Apr 22 18:42:42.468915 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:42.468885 2568 generic.go:358] "Generic (PLEG): container finished" podID="7acfec53-d2af-4a00-a80f-87b1b1b045b1" containerID="e62a6dd15e9d1a234c9dc10a36268aa2928054b491466348c44f859d269678ce" exitCode=0 Apr 22 18:42:42.469270 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:42.468973 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerDied","Data":"e62a6dd15e9d1a234c9dc10a36268aa2928054b491466348c44f859d269678ce"} Apr 22 18:42:42.470209 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:42.469545 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:42.470209 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:42.469569 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:42.483691 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:42.483673 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:42:42.832473 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:42.832444 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:42.832613 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:42.832598 2568 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:42:42.833008 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:42.832989 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-2sskf" Apr 22 18:42:43.302693 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:43.302535 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:43.302790 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:43.302493 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:43.302790 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:43.302760 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:43.302900 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:43.302877 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:43.350616 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:43.350585 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p7h9z"] Apr 22 18:42:43.351278 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:43.351255 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b2z66"] Apr 22 18:42:43.472211 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:43.472129 2568 generic.go:358] "Generic (PLEG): container finished" podID="7acfec53-d2af-4a00-a80f-87b1b1b045b1" containerID="8437f980198950a4dae43432e43b724d8df1f55fcf7a7b42409d0b8a7cdcd4eb" exitCode=0 Apr 22 18:42:43.472897 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:43.472231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerDied","Data":"8437f980198950a4dae43432e43b724d8df1f55fcf7a7b42409d0b8a7cdcd4eb"} Apr 22 18:42:43.472897 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:43.472325 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:43.472897 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:43.472419 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:43.472897 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:43.472426 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:43.472897 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:43.472564 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:44.417305 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.417266 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8krsn"] Apr 22 18:42:44.419300 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.419278 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.419437 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:44.419354 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8krsn" podUID="97b448ce-3744-4ce3-8f4c-027793be2c53" Apr 22 18:42:44.431263 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.431238 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8krsn"] Apr 22 18:42:44.476309 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.476276 2568 generic.go:358] "Generic (PLEG): container finished" podID="7acfec53-d2af-4a00-a80f-87b1b1b045b1" containerID="45d986599781042047132400d9a132bdec069542381a5035b4c15b30c5d59404" exitCode=0 Apr 22 18:42:44.476744 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.476360 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.476744 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.476363 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerDied","Data":"45d986599781042047132400d9a132bdec069542381a5035b4c15b30c5d59404"} Apr 22 18:42:44.476744 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:44.476654 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-8krsn" podUID="97b448ce-3744-4ce3-8f4c-027793be2c53" Apr 22 18:42:44.596432 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.596397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97b448ce-3744-4ce3-8f4c-027793be2c53-kubelet-config\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.596619 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.596495 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97b448ce-3744-4ce3-8f4c-027793be2c53-dbus\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.596619 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.596584 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.697751 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.697664 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97b448ce-3744-4ce3-8f4c-027793be2c53-dbus\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.697751 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.697727 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.697955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.697770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97b448ce-3744-4ce3-8f4c-027793be2c53-kubelet-config\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.697955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.697883 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/97b448ce-3744-4ce3-8f4c-027793be2c53-kubelet-config\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.697955 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:44.697884 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:44.697955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:44.697912 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/97b448ce-3744-4ce3-8f4c-027793be2c53-dbus\") pod \"global-pull-secret-syncer-8krsn\" (UID: 
\"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:44.698139 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:44.697961 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret podName:97b448ce-3744-4ce3-8f4c-027793be2c53 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:45.197937418 +0000 UTC m=+28.485373590 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret") pod "global-pull-secret-syncer-8krsn" (UID: "97b448ce-3744-4ce3-8f4c-027793be2c53") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:45.200879 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:45.200840 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:45.201061 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:45.200992 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:45.201113 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:45.201069 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret podName:97b448ce-3744-4ce3-8f4c-027793be2c53 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:46.201049284 +0000 UTC m=+29.488485455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret") pod "global-pull-secret-syncer-8krsn" (UID: "97b448ce-3744-4ce3-8f4c-027793be2c53") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:45.301883 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:45.301846 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:45.302026 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:45.302001 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:45.302203 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:45.302182 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:45.302338 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:45.302296 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:46.208659 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:46.208573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:46.209249 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:46.208704 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:46.209249 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:46.208763 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret podName:97b448ce-3744-4ce3-8f4c-027793be2c53 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:48.208748491 +0000 UTC m=+31.496184648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret") pod "global-pull-secret-syncer-8krsn" (UID: "97b448ce-3744-4ce3-8f4c-027793be2c53") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:46.302013 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:46.301981 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:46.302180 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:46.302135 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8krsn" podUID="97b448ce-3744-4ce3-8f4c-027793be2c53" Apr 22 18:42:47.303780 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:47.303281 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:47.303780 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:47.303372 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:47.303780 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:47.303410 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:47.303780 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:47.303474 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:48.224102 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:48.223912 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:48.224241 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:48.224042 2568 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:48.224241 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:48.224174 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret podName:97b448ce-3744-4ce3-8f4c-027793be2c53 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:52.224160671 +0000 UTC m=+35.511596827 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret") pod "global-pull-secret-syncer-8krsn" (UID: "97b448ce-3744-4ce3-8f4c-027793be2c53") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:42:48.301986 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:48.301950 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:48.302143 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:48.302081 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-8krsn" podUID="97b448ce-3744-4ce3-8f4c-027793be2c53" Apr 22 18:42:49.301840 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.301809 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:49.302278 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:49.301939 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:42:49.302278 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.302099 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:49.302278 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:49.302206 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-b2z66" podUID="ed2bee6e-74d3-4d8c-be17-c9c35096ad8b" Apr 22 18:42:49.564918 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.564841 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-223.ec2.internal" event="NodeReady" Apr 22 18:42:49.565077 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.564977 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:42:49.658209 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.658175 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-j7zw4"] Apr 22 18:42:49.683788 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.683748 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9c2rk"] Apr 22 18:42:49.683947 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.683934 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.686445 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.686423 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5kx7g\"" Apr 22 18:42:49.686895 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.686763 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:42:49.686895 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.686854 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:42:49.698369 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.698346 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9c2rk"] Apr 22 18:42:49.698369 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.698373 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j7zw4"] Apr 22 18:42:49.698533 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.698466 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:49.702540 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.702513 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:42:49.702635 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.702516 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:42:49.702635 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.702627 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrbrj\"" Apr 22 18:42:49.702722 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.702656 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:42:49.836297 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.836216 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmltf\" (UniqueName: \"kubernetes.io/projected/d0da4783-9f5c-40c3-80f0-155df59f22de-kube-api-access-nmltf\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.836297 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.836274 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb26f\" (UniqueName: \"kubernetes.io/projected/b09bc722-d952-4713-bbad-524034fa2063-kube-api-access-wb26f\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:49.836526 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.836337 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0da4783-9f5c-40c3-80f0-155df59f22de-config-volume\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.836526 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.836357 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.836526 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.836393 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:49.836526 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.836412 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0da4783-9f5c-40c3-80f0-155df59f22de-tmp-dir\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.937597 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.937564 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wb26f\" (UniqueName: \"kubernetes.io/projected/b09bc722-d952-4713-bbad-524034fa2063-kube-api-access-wb26f\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:49.937791 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.937609 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0da4783-9f5c-40c3-80f0-155df59f22de-config-volume\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.937791 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.937629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.937791 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.937645 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:49.937791 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.937660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0da4783-9f5c-40c3-80f0-155df59f22de-tmp-dir\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.937791 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.937708 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmltf\" (UniqueName: \"kubernetes.io/projected/d0da4783-9f5c-40c3-80f0-155df59f22de-kube-api-access-nmltf\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.937791 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:49.937758 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:42:49.938198 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:49.937758 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:42:49.938198 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:49.937827 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:42:50.437807382 +0000 UTC m=+33.725243538 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found Apr 22 18:42:49.938198 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:49.937845 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:50.437834789 +0000 UTC m=+33.725270949 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found Apr 22 18:42:49.938198 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.938093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d0da4783-9f5c-40c3-80f0-155df59f22de-tmp-dir\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.938382 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.938289 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0da4783-9f5c-40c3-80f0-155df59f22de-config-volume\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.949427 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.949401 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmltf\" (UniqueName: \"kubernetes.io/projected/d0da4783-9f5c-40c3-80f0-155df59f22de-kube-api-access-nmltf\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:49.949572 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:49.949444 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb26f\" (UniqueName: \"kubernetes.io/projected/b09bc722-d952-4713-bbad-524034fa2063-kube-api-access-wb26f\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:50.302493 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:50.302465 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:50.305165 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:50.305145 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:42:50.441621 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:50.441593 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:50.441718 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:50.441632 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:50.441718 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:50.441710 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:42:50.441793 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:50.441711 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:42:50.441793 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:50.441757 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:42:51.441742466 +0000 UTC m=+34.729178622 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found Apr 22 18:42:50.441793 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:50.441771 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:51.441764939 +0000 UTC m=+34.729201094 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found Apr 22 18:42:50.490456 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:50.490419 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerStarted","Data":"2b734d9095a6780259bf058f10222726787c5893f188b18fe3527f118786fbe9"} Apr 22 18:42:50.946729 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:50.946699 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:50.946913 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:50.946849 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:50.946913 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:50.946911 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:43:22.946897201 +0000 UTC m=+66.234333357 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:42:51.048019 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.047984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:51.048177 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.048140 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:42:51.048177 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.048159 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:42:51.048177 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.048170 2568 projected.go:194] Error preparing data for projected volume kube-api-access-97rlf for pod openshift-network-diagnostics/network-check-target-b2z66: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:51.048337 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.048217 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf podName:ed2bee6e-74d3-4d8c-be17-c9c35096ad8b nodeName:}" failed. 
No retries permitted until 2026-04-22 18:43:23.048204316 +0000 UTC m=+66.335640471 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-97rlf" (UniqueName: "kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf") pod "network-check-target-b2z66" (UID: "ed2bee6e-74d3-4d8c-be17-c9c35096ad8b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:42:51.302160 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.302136 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:42:51.302302 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.302145 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:42:51.305010 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.304990 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:42:51.306014 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.305993 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:42:51.306014 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.306017 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7gnxr\"" Apr 22 18:42:51.306212 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.306026 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:42:51.306212 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.306102 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xrqzc\"" Apr 22 18:42:51.450810 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.450768 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:51.450810 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.450804 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:51.451028 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.450894 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:42:51.451028 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.450897 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:42:51.451028 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.450951 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. 
Apr 22 18:42:51.451028 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.450951 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:53.450935231 +0000 UTC m=+36.738371387 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found
Apr 22 18:42:51.451028 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:51.450966 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:42:53.450959976 +0000 UTC m=+36.738396132 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found
Apr 22 18:42:51.495272 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.495235 2568 generic.go:358] "Generic (PLEG): container finished" podID="7acfec53-d2af-4a00-a80f-87b1b1b045b1" containerID="2b734d9095a6780259bf058f10222726787c5893f188b18fe3527f118786fbe9" exitCode=0
Apr 22 18:42:51.495272 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:51.495283 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerDied","Data":"2b734d9095a6780259bf058f10222726787c5893f188b18fe3527f118786fbe9"}
Apr 22 18:42:52.258277 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:52.258242 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn"
Apr 22 18:42:52.261090 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:52.261066 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/97b448ce-3744-4ce3-8f4c-027793be2c53-original-pull-secret\") pod \"global-pull-secret-syncer-8krsn\" (UID: \"97b448ce-3744-4ce3-8f4c-027793be2c53\") " pod="kube-system/global-pull-secret-syncer-8krsn"
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8krsn" Apr 22 18:42:52.500329 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:52.500144 2568 generic.go:358] "Generic (PLEG): container finished" podID="7acfec53-d2af-4a00-a80f-87b1b1b045b1" containerID="44babb5c402890751db9fd9f9d6e0f8b83907329cbcece9f7ad615501ab15818" exitCode=0 Apr 22 18:42:52.500461 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:52.500215 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerDied","Data":"44babb5c402890751db9fd9f9d6e0f8b83907329cbcece9f7ad615501ab15818"} Apr 22 18:42:52.592709 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:52.592677 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8krsn"] Apr 22 18:42:52.600540 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:42:52.599383 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97b448ce_3744_4ce3_8f4c_027793be2c53.slice/crio-997fe3cc3a0412236676aa82bfd2d811a52a42cefae356336ca4108ce1346aac WatchSource:0}: Error finding container 997fe3cc3a0412236676aa82bfd2d811a52a42cefae356336ca4108ce1346aac: Status 404 returned error can't find the container with id 997fe3cc3a0412236676aa82bfd2d811a52a42cefae356336ca4108ce1346aac Apr 22 18:42:53.469142 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:53.469108 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:42:53.469142 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:53.469147 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:42:53.469909 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:53.469263 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:42:53.469909 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:53.469309 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:42:53.469909 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:53.469325 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:42:57.469304663 +0000 UTC m=+40.756740820 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found Apr 22 18:42:53.469909 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:53.469358 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. 
Apr 22 18:42:53.469909 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:53.469358 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. No retries permitted until 2026-04-22 18:42:57.469342292 +0000 UTC m=+40.756778453 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found
Apr 22 18:42:53.504136 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:53.504067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8krsn" event={"ID":"97b448ce-3744-4ce3-8f4c-027793be2c53","Type":"ContainerStarted","Data":"997fe3cc3a0412236676aa82bfd2d811a52a42cefae356336ca4108ce1346aac"}
Apr 22 18:42:53.507685 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:53.507649 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" event={"ID":"7acfec53-d2af-4a00-a80f-87b1b1b045b1","Type":"ContainerStarted","Data":"9f19143a9c5ef51fb08d1762a5914be55bd0f387dc20fa7cda8b4934161657de"}
Apr 22 18:42:53.536755 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:53.536684 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bvmpf" podStartSLOduration=6.094700926 podStartE2EDuration="36.536665408s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:42:19.837103191 +0000 UTC m=+3.124539350" lastFinishedPulling="2026-04-22 18:42:50.279067661 +0000 UTC m=+33.566503832" observedRunningTime="2026-04-22 18:42:53.535877993 +0000 UTC m=+36.823314197" watchObservedRunningTime="2026-04-22 18:42:53.536665408 +0000 UTC m=+36.824101585"
Apr 22 18:42:57.500488 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:57.500445 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4"
Apr 22 18:42:57.500488 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:57.500486 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk"
Apr 22 18:42:57.501049 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:57.500595 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:42:57.501049 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:57.500617 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
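The pod_startup_latency_tracker entry above for multus-additional-cni-plugins-bvmpf is internally consistent: podStartSLOduration is the end-to-end startup duration minus the time spent pulling images (lastFinishedPulling minus firstStartedPulling). A short reconstruction of that arithmetic, using only the timestamps printed in the entry:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the "Observed pod startup duration" entry above.
	layout := "2006-01-02 15:04:05.000000000 -0700 MST"
	first, _ := time.Parse(layout, "2026-04-22 18:42:19.837103191 +0000 UTC")
	last, _ := time.Parse(layout, "2026-04-22 18:42:50.279067661 +0000 UTC")
	e2e := 36536665408 * time.Nanosecond // podStartE2EDuration="36.536665408s"

	// SLO duration = end-to-end duration minus image-pull time.
	slo := e2e - last.Sub(first)
	fmt.Println(slo) // ~6.0947s, matching podStartSLOduration=6.094700926 up to float rounding
}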
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found Apr 22 18:42:57.501049 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:42:57.500686 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:43:05.500665794 +0000 UTC m=+48.788101954 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found Apr 22 18:42:57.516391 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:57.516364 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8krsn" event={"ID":"97b448ce-3744-4ce3-8f4c-027793be2c53","Type":"ContainerStarted","Data":"ca7c5c7da218d7809fc8b814abd62cf9edacd94af9f43206bf494dcdec7b9e63"} Apr 22 18:42:57.533346 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:42:57.533304 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8krsn" podStartSLOduration=9.692190151 podStartE2EDuration="13.533290231s" podCreationTimestamp="2026-04-22 18:42:44 +0000 UTC" firstStartedPulling="2026-04-22 18:42:52.603322903 +0000 UTC m=+35.890759059" lastFinishedPulling="2026-04-22 18:42:56.444422978 +0000 UTC m=+39.731859139" observedRunningTime="2026-04-22 18:42:57.532738138 +0000 UTC m=+40.820174318" watchObservedRunningTime="2026-04-22 18:42:57.533290231 +0000 UTC m=+40.820726403" Apr 22 18:43:05.553084 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:05.553042 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:43:05.553084 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:05.553084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:43:05.553523 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:05.553182 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:05.553523 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:05.553185 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:05.553523 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:05.553243 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:21.553229212 +0000 UTC m=+64.840665373 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found Apr 22 18:43:05.553523 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:05.553256 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:43:21.553250251 +0000 UTC m=+64.840686407 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found Apr 22 18:43:14.486911 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:14.486882 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d7r2q" Apr 22 18:43:21.563603 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:21.563573 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:43:21.563603 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:21.563609 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:43:21.564029 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:21.563717 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:21.564029 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:21.563717 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:21.564029 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:21.563764 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. No retries permitted until 2026-04-22 18:43:53.563751009 +0000 UTC m=+96.851187165 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found Apr 22 18:43:21.564029 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:21.563778 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:43:53.563771138 +0000 UTC m=+96.851207293 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found Apr 22 18:43:22.974916 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:22.974874 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:43:22.977325 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:22.977302 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:43:22.985433 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:22.985408 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:43:22.985546 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:22.985482 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:44:26.985461809 +0000 UTC m=+130.272897965 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : secret "metrics-daemon-secret" not found Apr 22 18:43:23.075517 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:23.075448 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:43:23.078417 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:23.078396 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:43:23.088765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:23.088745 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:43:23.099927 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:23.099903 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97rlf\" (UniqueName: \"kubernetes.io/projected/ed2bee6e-74d3-4d8c-be17-c9c35096ad8b-kube-api-access-97rlf\") pod \"network-check-target-b2z66\" (UID: \"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b\") " pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:43:23.119637 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:23.119613 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xrqzc\"" Apr 22 18:43:23.127966 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:23.127949 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:43:23.244659 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:23.244576 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-b2z66"] Apr 22 18:43:23.247606 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:43:23.247578 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded2bee6e_74d3_4d8c_be17_c9c35096ad8b.slice/crio-b847aaf35fbc6d2a6020b9e37a905d9a3b51fee294ce98db66eff4f407a27679 WatchSource:0}: Error finding container b847aaf35fbc6d2a6020b9e37a905d9a3b51fee294ce98db66eff4f407a27679: Status 404 returned error can't find the container with id b847aaf35fbc6d2a6020b9e37a905d9a3b51fee294ce98db66eff4f407a27679 Apr 22 18:43:23.563359 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:23.563324 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b2z66" event={"ID":"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b","Type":"ContainerStarted","Data":"b847aaf35fbc6d2a6020b9e37a905d9a3b51fee294ce98db66eff4f407a27679"} Apr 22 18:43:26.570690 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:26.570653 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-b2z66" event={"ID":"ed2bee6e-74d3-4d8c-be17-c9c35096ad8b","Type":"ContainerStarted","Data":"ffc07b9562c4cd5fe40bbd2c9a8f508a0a8e3a39f9fbefa6c89025388df81d7a"} Apr 22 18:43:26.571061 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:26.570916 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:43:26.592726 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:26.592681 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-b2z66" podStartSLOduration=67.060459292 podStartE2EDuration="1m9.592668494s" podCreationTimestamp="2026-04-22 18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:43:23.249439255 +0000 UTC m=+66.536875427" lastFinishedPulling="2026-04-22 18:43:25.78164847 +0000 UTC m=+69.069084629" observedRunningTime="2026-04-22 18:43:26.591936478 +0000 UTC m=+69.879372732" watchObservedRunningTime="2026-04-22 18:43:26.592668494 +0000 UTC m=+69.880104666" Apr 22 18:43:53.591970 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:53.591920 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:43:53.591970 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:53.591975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:43:53.592447 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:53.592077 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:43:53.592447 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:53.592146 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:44:57.592129405 +0000 UTC m=+160.879565565 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found Apr 22 18:43:53.592447 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:53.592077 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:43:53.592447 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:43:53.592218 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:57.592207281 +0000 UTC m=+160.879643441 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found Apr 22 18:43:57.575213 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:43:57.575182 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-b2z66" Apr 22 18:44:27.022746 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:27.022686 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:44:27.023226 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:27.022837 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:44:27.023226 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:27.022904 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs podName:42b4b135-c4b0-4460-84ed-684f25a4436d nodeName:}" failed. No retries permitted until 2026-04-22 18:46:29.022884837 +0000 UTC m=+252.310321001 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs") pod "network-metrics-daemon-p7h9z" (UID: "42b4b135-c4b0-4460-84ed-684f25a4436d") : secret "metrics-daemon-secret" not found Apr 22 18:44:42.745237 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.745203 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zw28s"] Apr 22 18:44:42.747247 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.747232 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.749871 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.749842 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 22 18:44:42.749871 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.749858 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:44:42.749871 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.749845 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:44:42.750051 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.749850 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:44:42.750831 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.750815 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-bggc9\"" Apr 22 18:44:42.754752 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.754735 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 22 18:44:42.756951 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.756932 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zw28s"] Apr 22 18:44:42.830829 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.830789 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b8003c92-2bc0-4825-974f-12470b332830-snapshots\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.830829 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.830826 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8003c92-2bc0-4825-974f-12470b332830-service-ca-bundle\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.831056 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.830846 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xfg\" (UniqueName: \"kubernetes.io/projected/b8003c92-2bc0-4825-974f-12470b332830-kube-api-access-s5xfg\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.831056 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.830932 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8003c92-2bc0-4825-974f-12470b332830-serving-cert\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.831056 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.830988 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/b8003c92-2bc0-4825-974f-12470b332830-tmp\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.831056 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.831023 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8003c92-2bc0-4825-974f-12470b332830-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932004 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.931948 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8003c92-2bc0-4825-974f-12470b332830-tmp\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932195 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932015 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8003c92-2bc0-4825-974f-12470b332830-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932195 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932072 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b8003c92-2bc0-4825-974f-12470b332830-snapshots\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932195 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932099 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8003c92-2bc0-4825-974f-12470b332830-service-ca-bundle\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932195 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932126 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xfg\" (UniqueName: \"kubernetes.io/projected/b8003c92-2bc0-4825-974f-12470b332830-kube-api-access-s5xfg\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932195 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932156 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8003c92-2bc0-4825-974f-12470b332830-serving-cert\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932453 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932431 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b8003c92-2bc0-4825-974f-12470b332830-tmp\") pod \"insights-operator-585dfdc468-zw28s\" (UID: 
\"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932729 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932707 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8003c92-2bc0-4825-974f-12470b332830-service-ca-bundle\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932729 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932719 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/b8003c92-2bc0-4825-974f-12470b332830-snapshots\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.932841 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.932801 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b8003c92-2bc0-4825-974f-12470b332830-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.934464 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.934447 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b8003c92-2bc0-4825-974f-12470b332830-serving-cert\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:42.940836 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:42.940812 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xfg\" (UniqueName: \"kubernetes.io/projected/b8003c92-2bc0-4825-974f-12470b332830-kube-api-access-s5xfg\") pod \"insights-operator-585dfdc468-zw28s\" (UID: \"b8003c92-2bc0-4825-974f-12470b332830\") " pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:43.056454 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:43.056419 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-zw28s" Apr 22 18:44:43.173276 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:43.173241 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-zw28s"] Apr 22 18:44:43.176564 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:44:43.176536 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8003c92_2bc0_4825_974f_12470b332830.slice/crio-a5f4b1aa14d5ac15f4baad0c3ea43f6e2830b2f786849812fe07be4b2acef512 WatchSource:0}: Error finding container a5f4b1aa14d5ac15f4baad0c3ea43f6e2830b2f786849812fe07be4b2acef512: Status 404 returned error can't find the container with id a5f4b1aa14d5ac15f4baad0c3ea43f6e2830b2f786849812fe07be4b2acef512 Apr 22 18:44:43.709803 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:43.709767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zw28s" event={"ID":"b8003c92-2bc0-4825-974f-12470b332830","Type":"ContainerStarted","Data":"a5f4b1aa14d5ac15f4baad0c3ea43f6e2830b2f786849812fe07be4b2acef512"} Apr 22 18:44:45.714731 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:45.714691 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zw28s" event={"ID":"b8003c92-2bc0-4825-974f-12470b332830","Type":"ContainerStarted","Data":"4c2770a5e14b94dc933f7febba5bfdd712f6ca4336a1bc5f49d096c097d7e09c"} Apr 22 18:44:45.731455 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:45.731289 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-zw28s" podStartSLOduration=1.663858407 podStartE2EDuration="3.731277276s" podCreationTimestamp="2026-04-22 18:44:42 +0000 UTC" firstStartedPulling="2026-04-22 18:44:43.178761199 +0000 UTC m=+146.466197358" lastFinishedPulling="2026-04-22 18:44:45.246180072 +0000 UTC m=+148.533616227" observedRunningTime="2026-04-22 18:44:45.730978089 +0000 UTC m=+149.018414267" watchObservedRunningTime="2026-04-22 18:44:45.731277276 +0000 UTC m=+149.018713454" Apr 22 18:44:47.752151 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.752116 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr"] Apr 22 18:44:47.754129 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.754113 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:47.757663 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.757402 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bb4g7\"" Apr 22 18:44:47.757771 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.757730 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:44:47.757876 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.757850 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 18:44:47.757924 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.757856 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:44:47.763331 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.763308 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr"] Apr 22 18:44:47.871370 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.871330 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvqb9\" (UniqueName: \"kubernetes.io/projected/cd3fc412-4092-4a65-9d81-53003ca26d33-kube-api-access-cvqb9\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:47.871538 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.871478 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:47.972853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.972818 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:47.972853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.972865 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvqb9\" (UniqueName: \"kubernetes.io/projected/cd3fc412-4092-4a65-9d81-53003ca26d33-kube-api-access-cvqb9\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:47.973071 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:47.972967 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:44:47.973071 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:47.973047 2568 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls podName:cd3fc412-4092-4a65-9d81-53003ca26d33 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:48.473029845 +0000 UTC m=+151.760466001 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dwddr" (UID: "cd3fc412-4092-4a65-9d81-53003ca26d33") : secret "samples-operator-tls" not found Apr 22 18:44:47.983106 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:47.983082 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvqb9\" (UniqueName: \"kubernetes.io/projected/cd3fc412-4092-4a65-9d81-53003ca26d33-kube-api-access-cvqb9\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:48.477574 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:48.477516 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:48.477758 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:48.477608 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:44:48.477758 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:48.477690 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls podName:cd3fc412-4092-4a65-9d81-53003ca26d33 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:49.47767463 +0000 UTC m=+152.765110785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dwddr" (UID: "cd3fc412-4092-4a65-9d81-53003ca26d33") : secret "samples-operator-tls" not found Apr 22 18:44:48.513169 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:48.513146 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hn296_7b2ce3e8-c29b-4c51-bc68-022864d2d2fa/dns-node-resolver/0.log" Apr 22 18:44:49.313468 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:49.313446 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pnwpz_fa08d78c-48ce-42a5-85a9-3f4ae1d8a468/node-ca/0.log" Apr 22 18:44:49.485052 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:49.485001 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:49.485226 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:49.485151 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:44:49.485226 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:49.485210 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls podName:cd3fc412-4092-4a65-9d81-53003ca26d33 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:51.485195394 +0000 UTC m=+154.772631551 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dwddr" (UID: "cd3fc412-4092-4a65-9d81-53003ca26d33") : secret "samples-operator-tls" not found Apr 22 18:44:51.499877 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:51.499844 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:51.500285 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:51.499999 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:44:51.500285 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:51.500058 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls podName:cd3fc412-4092-4a65-9d81-53003ca26d33 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:55.500042753 +0000 UTC m=+158.787478908 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dwddr" (UID: "cd3fc412-4092-4a65-9d81-53003ca26d33") : secret "samples-operator-tls" not found Apr 22 18:44:52.687560 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.687527 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6"] Apr 22 18:44:52.689485 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.689470 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.691691 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.691668 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 18:44:52.691838 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.691819 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 18:44:52.691889 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.691849 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-hl5zl\"" Apr 22 18:44:52.692070 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.692055 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:44:52.692717 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.692699 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 18:44:52.694902 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:52.694874 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-j7zw4" podUID="d0da4783-9f5c-40c3-80f0-155df59f22de" Apr 22 18:44:52.697777 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.697741 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6"] Apr 22 18:44:52.706673 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:52.706648 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9c2rk" podUID="b09bc722-d952-4713-bbad-524034fa2063" Apr 22 18:44:52.727652 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.727627 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j7zw4" Apr 22 18:44:52.810698 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.810667 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtbl9\" (UniqueName: \"kubernetes.io/projected/baa71d25-c811-4836-a063-6c81be046499-kube-api-access-mtbl9\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.810858 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.810720 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa71d25-c811-4836-a063-6c81be046499-config\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.810858 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.810824 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baa71d25-c811-4836-a063-6c81be046499-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.911132 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.911104 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baa71d25-c811-4836-a063-6c81be046499-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.911293 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.911269 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtbl9\" (UniqueName: \"kubernetes.io/projected/baa71d25-c811-4836-a063-6c81be046499-kube-api-access-mtbl9\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.911353 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.911335 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa71d25-c811-4836-a063-6c81be046499-config\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.911866 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.911846 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/baa71d25-c811-4836-a063-6c81be046499-config\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.913291 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.913273 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/baa71d25-c811-4836-a063-6c81be046499-serving-cert\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.919842 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.919820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtbl9\" (UniqueName: \"kubernetes.io/projected/baa71d25-c811-4836-a063-6c81be046499-kube-api-access-mtbl9\") pod \"service-ca-operator-d6fc45fc5-rvjt6\" (UID: \"baa71d25-c811-4836-a063-6c81be046499\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:52.999001 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:52.998921 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" Apr 22 18:44:53.112583 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:53.112555 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6"] Apr 22 18:44:53.116133 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:44:53.116105 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaa71d25_c811_4836_a063_6c81be046499.slice/crio-e76e9cf1edec08b5184f798bacaa9a1a26d1cc483ddf064e48733c1bfaa743c2 WatchSource:0}: Error finding container e76e9cf1edec08b5184f798bacaa9a1a26d1cc483ddf064e48733c1bfaa743c2: Status 404 returned error can't find the container with id e76e9cf1edec08b5184f798bacaa9a1a26d1cc483ddf064e48733c1bfaa743c2 Apr 22 18:44:53.731327 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:53.731276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" event={"ID":"baa71d25-c811-4836-a063-6c81be046499","Type":"ContainerStarted","Data":"e76e9cf1edec08b5184f798bacaa9a1a26d1cc483ddf064e48733c1bfaa743c2"} Apr 22 18:44:54.312530 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:54.312477 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-p7h9z" podUID="42b4b135-c4b0-4460-84ed-684f25a4436d" Apr 22 18:44:54.752934 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.752858 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6cd49db9df-9t66v"] Apr 22 18:44:54.754983 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.754959 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:54.758116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.758077 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:44:54.758249 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.758203 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:44:54.758355 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.758331 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:44:54.758457 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.758358 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5lrh7\"" Apr 22 18:44:54.763608 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.763590 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:44:54.770975 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.770953 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cd49db9df-9t66v"] Apr 22 18:44:54.925231 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.925194 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-installation-pull-secrets\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:54.925395 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.925267 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:54.925395 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.925293 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9kvf\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-kube-api-access-p9kvf\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:54.925395 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.925335 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-image-registry-private-configuration\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:54.925558 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.925396 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-bound-sa-token\") pod 
\"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:54.925558 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.925463 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-trusted-ca\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:54.925558 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.925520 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-registry-certificates\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:54.925666 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:54.925586 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23d4df5d-868e-4a93-8d62-72f442e82013-ca-trust-extracted\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.026876 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.026773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-image-registry-private-configuration\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.026876 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.026836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-bound-sa-token\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.027088 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.026892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-trusted-ca\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.027088 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.026932 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-registry-certificates\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.027088 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.026976 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23d4df5d-868e-4a93-8d62-72f442e82013-ca-trust-extracted\") 
pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.027088 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.027023 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-installation-pull-secrets\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.027088 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.027074 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.027305 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.027099 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9kvf\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-kube-api-access-p9kvf\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.027305 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:55.027194 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:44:55.027305 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:55.027205 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cd49db9df-9t66v: secret "image-registry-tls" not found Apr 22 18:44:55.027305 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:55.027261 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls podName:23d4df5d-868e-4a93-8d62-72f442e82013 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:55.527245541 +0000 UTC m=+158.814681697 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls") pod "image-registry-6cd49db9df-9t66v" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013") : secret "image-registry-tls" not found Apr 22 18:44:55.027669 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.027641 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-registry-certificates\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.027835 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.027806 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23d4df5d-868e-4a93-8d62-72f442e82013-ca-trust-extracted\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.028193 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.028154 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-trusted-ca\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.029645 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.029620 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-image-registry-private-configuration\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.029964 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.029942 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-installation-pull-secrets\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.038356 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.038329 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9kvf\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-kube-api-access-p9kvf\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.038668 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.038641 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-bound-sa-token\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.530182 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.530142 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:55.530374 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.530209 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:44:55.530374 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:55.530301 2568 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:44:55.530374 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:55.530305 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:44:55.530374 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:55.530324 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cd49db9df-9t66v: secret "image-registry-tls" not found Apr 22 18:44:55.530374 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:55.530364 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls podName:cd3fc412-4092-4a65-9d81-53003ca26d33 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:03.530349192 +0000 UTC m=+166.817785354 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-dwddr" (UID: "cd3fc412-4092-4a65-9d81-53003ca26d33") : secret "samples-operator-tls" not found Apr 22 18:44:55.530374 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:55.530378 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls podName:23d4df5d-868e-4a93-8d62-72f442e82013 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:56.530370684 +0000 UTC m=+159.817806839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls") pod "image-registry-6cd49db9df-9t66v" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013") : secret "image-registry-tls" not found Apr 22 18:44:55.736510 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.736475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" event={"ID":"baa71d25-c811-4836-a063-6c81be046499","Type":"ContainerStarted","Data":"6960bf106c479b28cc5cca5cd5aedbc6eba8d7f83f5365192555e93260b3f4ae"} Apr 22 18:44:55.752055 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:55.751994 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" podStartSLOduration=1.782077148 podStartE2EDuration="3.751976388s" podCreationTimestamp="2026-04-22 18:44:52 +0000 UTC" firstStartedPulling="2026-04-22 18:44:53.117878044 +0000 UTC m=+156.405314205" lastFinishedPulling="2026-04-22 18:44:55.087777286 +0000 UTC m=+158.375213445" observedRunningTime="2026-04-22 18:44:55.751684733 +0000 UTC m=+159.039120911" watchObservedRunningTime="2026-04-22 18:44:55.751976388 +0000 UTC m=+159.039412572" Apr 22 18:44:56.537893 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:56.537857 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:56.538375 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:56.538027 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:44:56.538375 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:56.538053 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cd49db9df-9t66v: secret "image-registry-tls" not found Apr 22 18:44:56.538375 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:56.538118 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls podName:23d4df5d-868e-4a93-8d62-72f442e82013 nodeName:}" failed. No retries permitted until 2026-04-22 18:44:58.538098342 +0000 UTC m=+161.825534507 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls") pod "image-registry-6cd49db9df-9t66v" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013") : secret "image-registry-tls" not found Apr 22 18:44:57.646401 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:57.646358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:44:57.646401 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:57.646409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:44:57.646980 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:57.646540 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:44:57.646980 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:57.646600 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:44:57.646980 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:57.646609 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert podName:b09bc722-d952-4713-bbad-524034fa2063 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.646586539 +0000 UTC m=+282.934022702 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert") pod "ingress-canary-9c2rk" (UID: "b09bc722-d952-4713-bbad-524034fa2063") : secret "canary-serving-cert" not found Apr 22 18:44:57.646980 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:57.646677 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls podName:d0da4783-9f5c-40c3-80f0-155df59f22de nodeName:}" failed. No retries permitted until 2026-04-22 18:46:59.646662717 +0000 UTC m=+282.934098873 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls") pod "dns-default-j7zw4" (UID: "d0da4783-9f5c-40c3-80f0-155df59f22de") : secret "dns-default-metrics-tls" not found Apr 22 18:44:58.553845 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:58.553806 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:44:58.554015 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:58.553922 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:44:58.554015 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:58.553934 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cd49db9df-9t66v: secret "image-registry-tls" not found Apr 22 18:44:58.554015 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:44:58.553983 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls podName:23d4df5d-868e-4a93-8d62-72f442e82013 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:02.55396875 +0000 UTC m=+165.841404906 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls") pod "image-registry-6cd49db9df-9t66v" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013") : secret "image-registry-tls" not found Apr 22 18:44:59.150511 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.150476 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-f8qql"] Apr 22 18:44:59.152539 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.152524 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.154947 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.154917 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:44:59.155077 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.155028 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:44:59.155873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.155856 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hbk5q\"" Apr 22 18:44:59.155873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.155867 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:44:59.156003 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.155916 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:44:59.160455 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.160434 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-f8qql"] Apr 22 18:44:59.259291 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.259255 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfpqd\" (UniqueName: \"kubernetes.io/projected/b0e32025-977e-4d9f-8db0-d33913dfc1e9-kube-api-access-zfpqd\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.259291 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.259295 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b0e32025-977e-4d9f-8db0-d33913dfc1e9-signing-cabundle\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.259532 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.259422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b0e32025-977e-4d9f-8db0-d33913dfc1e9-signing-key\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.359803 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.359741 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfpqd\" (UniqueName: \"kubernetes.io/projected/b0e32025-977e-4d9f-8db0-d33913dfc1e9-kube-api-access-zfpqd\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.359803 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.359803 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b0e32025-977e-4d9f-8db0-d33913dfc1e9-signing-cabundle\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.360018 ip-10-0-137-223 
kubenswrapper[2568]: I0422 18:44:59.359997 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b0e32025-977e-4d9f-8db0-d33913dfc1e9-signing-key\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.360424 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.360405 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b0e32025-977e-4d9f-8db0-d33913dfc1e9-signing-cabundle\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.362470 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.362452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b0e32025-977e-4d9f-8db0-d33913dfc1e9-signing-key\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.368014 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.367988 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfpqd\" (UniqueName: \"kubernetes.io/projected/b0e32025-977e-4d9f-8db0-d33913dfc1e9-kube-api-access-zfpqd\") pod \"service-ca-865cb79987-f8qql\" (UID: \"b0e32025-977e-4d9f-8db0-d33913dfc1e9\") " pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.461224 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.461131 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-f8qql" Apr 22 18:44:59.575429 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.575399 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-f8qql"] Apr 22 18:44:59.578689 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:44:59.578657 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0e32025_977e_4d9f_8db0_d33913dfc1e9.slice/crio-87f87399d52eb6d18ba2d51e9034c993f8a88749a65cf27fc537d0ce926dc740 WatchSource:0}: Error finding container 87f87399d52eb6d18ba2d51e9034c993f8a88749a65cf27fc537d0ce926dc740: Status 404 returned error can't find the container with id 87f87399d52eb6d18ba2d51e9034c993f8a88749a65cf27fc537d0ce926dc740 Apr 22 18:44:59.745946 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.745861 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-f8qql" event={"ID":"b0e32025-977e-4d9f-8db0-d33913dfc1e9","Type":"ContainerStarted","Data":"1e414e1ef9c9471d5ccaf458589e51c509160bfc1f2ed6563e7ea6cc110809ba"} Apr 22 18:44:59.745946 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.745902 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-f8qql" event={"ID":"b0e32025-977e-4d9f-8db0-d33913dfc1e9","Type":"ContainerStarted","Data":"87f87399d52eb6d18ba2d51e9034c993f8a88749a65cf27fc537d0ce926dc740"} Apr 22 18:44:59.780765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.780712 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-f8qql" podStartSLOduration=0.78069395 podStartE2EDuration="780.69395ms" 
podCreationTimestamp="2026-04-22 18:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:44:59.763984387 +0000 UTC m=+163.051420566" watchObservedRunningTime="2026-04-22 18:44:59.78069395 +0000 UTC m=+163.068130123" Apr 22 18:44:59.782139 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.782117 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hhfz5"] Apr 22 18:44:59.784465 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.784443 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:44:59.787091 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.787066 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-h7vq4\"" Apr 22 18:44:59.787516 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.787485 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:44:59.787893 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.787874 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:44:59.803075 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.803050 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hhfz5"] Apr 22 18:44:59.964917 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.964875 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-data-volume\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:44:59.964917 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.964914 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxw2z\" (UniqueName: \"kubernetes.io/projected/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-kube-api-access-gxw2z\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:44:59.965166 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.964941 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:44:59.965166 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.965054 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-crio-socket\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:44:59.965166 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:44:59.965146 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.066070 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.066040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-data-volume\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.066226 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.066077 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxw2z\" (UniqueName: \"kubernetes.io/projected/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-kube-api-access-gxw2z\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.066226 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.066097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.066226 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.066162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-crio-socket\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.066226 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.066207 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.066430 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.066281 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-crio-socket\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.066430 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:00.066319 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 18:45:00.066430 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:00.066378 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls podName:dba4c78e-e2a1-46fe-af9a-af4d512b4e4a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:00.566360418 +0000 UTC m=+163.853796588 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-hhfz5" (UID: "dba4c78e-e2a1-46fe-af9a-af4d512b4e4a") : secret "insights-runtime-extractor-tls" not found Apr 22 18:45:00.066578 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.066452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-data-volume\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.066680 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.066663 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.077844 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.077820 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxw2z\" (UniqueName: \"kubernetes.io/projected/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-kube-api-access-gxw2z\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.569561 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:00.569528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:00.569910 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:00.569678 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 18:45:00.569910 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:00.569744 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls podName:dba4c78e-e2a1-46fe-af9a-af4d512b4e4a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:01.569728285 +0000 UTC m=+164.857164442 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-hhfz5" (UID: "dba4c78e-e2a1-46fe-af9a-af4d512b4e4a") : secret "insights-runtime-extractor-tls" not found Apr 22 18:45:01.578717 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:01.578676 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:01.579105 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:01.578795 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 18:45:01.579105 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:01.578849 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls podName:dba4c78e-e2a1-46fe-af9a-af4d512b4e4a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:03.57883509 +0000 UTC m=+166.866271246 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-hhfz5" (UID: "dba4c78e-e2a1-46fe-af9a-af4d512b4e4a") : secret "insights-runtime-extractor-tls" not found Apr 22 18:45:02.588485 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:02.588444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:45:02.588905 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:02.588603 2568 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:45:02.588905 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:02.588624 2568 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6cd49db9df-9t66v: secret "image-registry-tls" not found Apr 22 18:45:02.588905 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:02.588677 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls podName:23d4df5d-868e-4a93-8d62-72f442e82013 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:10.588661376 +0000 UTC m=+173.876097531 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls") pod "image-registry-6cd49db9df-9t66v" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013") : secret "image-registry-tls" not found Apr 22 18:45:03.596642 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:03.596594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:03.597045 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:03.596659 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:45:03.597045 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:03.596763 2568 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 22 18:45:03.597045 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:03.596823 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls podName:dba4c78e-e2a1-46fe-af9a-af4d512b4e4a nodeName:}" failed. No retries permitted until 2026-04-22 18:45:07.596809621 +0000 UTC m=+170.884245780 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-hhfz5" (UID: "dba4c78e-e2a1-46fe-af9a-af4d512b4e4a") : secret "insights-runtime-extractor-tls" not found Apr 22 18:45:03.599084 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:03.599066 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cd3fc412-4092-4a65-9d81-53003ca26d33-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-dwddr\" (UID: \"cd3fc412-4092-4a65-9d81-53003ca26d33\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:45:03.663974 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:03.663936 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" Apr 22 18:45:03.790825 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:03.790784 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr"] Apr 22 18:45:04.757542 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:04.757487 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" event={"ID":"cd3fc412-4092-4a65-9d81-53003ca26d33","Type":"ContainerStarted","Data":"a7c5fa8db065618d77485447e29fc570b5c00cb865084bd54579315104f8f234"} Apr 22 18:45:05.761937 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:05.761897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" event={"ID":"cd3fc412-4092-4a65-9d81-53003ca26d33","Type":"ContainerStarted","Data":"a57b8d74cf235b65621ea60916ee0121cb69a23a26fbd0a68bace5b394ec7a18"} Apr 22 18:45:06.765161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:06.765081 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" event={"ID":"cd3fc412-4092-4a65-9d81-53003ca26d33","Type":"ContainerStarted","Data":"645c2c6319231be723b54d5e22892417b360f0828e804472748b0d473a7e06de"} Apr 22 18:45:06.782757 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:06.782704 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-dwddr" podStartSLOduration=17.959131227 podStartE2EDuration="19.782690455s" podCreationTimestamp="2026-04-22 18:44:47 +0000 UTC" firstStartedPulling="2026-04-22 18:45:03.829422874 +0000 UTC m=+167.116859030" lastFinishedPulling="2026-04-22 18:45:05.652982099 +0000 UTC m=+168.940418258" observedRunningTime="2026-04-22 18:45:06.781711683 +0000 UTC m=+170.069147865" watchObservedRunningTime="2026-04-22 18:45:06.782690455 +0000 UTC m=+170.070126688" Apr 22 18:45:07.303492 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:07.303453 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:45:07.303845 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:07.303821 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:45:07.631064 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:07.630970 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:07.634138 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:07.634101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dba4c78e-e2a1-46fe-af9a-af4d512b4e4a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hhfz5\" (UID: \"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a\") " pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:07.893486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:07.893375 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hhfz5" Apr 22 18:45:08.011756 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:08.011725 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hhfz5"] Apr 22 18:45:08.014748 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:08.014712 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddba4c78e_e2a1_46fe_af9a_af4d512b4e4a.slice/crio-a952033d90b9484533fd7229ba7f6597374fd4adc9a6e3bc3ee5c82299a2cf1a WatchSource:0}: Error finding container a952033d90b9484533fd7229ba7f6597374fd4adc9a6e3bc3ee5c82299a2cf1a: Status 404 returned error can't find the container with id a952033d90b9484533fd7229ba7f6597374fd4adc9a6e3bc3ee5c82299a2cf1a Apr 22 18:45:08.772258 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:08.772223 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hhfz5" event={"ID":"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a","Type":"ContainerStarted","Data":"013b994c15c5613a43a8ffd4262b2c162de77eeca58a509bab59e31c3a4a3ea7"} Apr 22 18:45:08.772394 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:08.772264 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hhfz5" event={"ID":"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a","Type":"ContainerStarted","Data":"0f504a67df06e9f0fff5755bb9ac7f5d21e27317b812a5f5d1b8e07963eac4d8"} Apr 22 18:45:08.772394 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:08.772282 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hhfz5" event={"ID":"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a","Type":"ContainerStarted","Data":"a952033d90b9484533fd7229ba7f6597374fd4adc9a6e3bc3ee5c82299a2cf1a"} Apr 22 18:45:10.657229 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:10.657189 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:45:10.659558 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:10.659537 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"image-registry-6cd49db9df-9t66v\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:45:10.666456 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:10.666436 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:45:10.783065 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:10.783031 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hhfz5" event={"ID":"dba4c78e-e2a1-46fe-af9a-af4d512b4e4a","Type":"ContainerStarted","Data":"19c322d15f1f203941b5c22b32b61e3aaf13842ab1ae7ca8f66196923d0392a9"} Apr 22 18:45:10.791627 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:10.791598 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6cd49db9df-9t66v"] Apr 22 18:45:10.795015 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:10.794990 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23d4df5d_868e_4a93_8d62_72f442e82013.slice/crio-efb23ec66ef778a1270ab26985ad388309fb45497d9944b6765a34ff12248151 WatchSource:0}: Error finding container efb23ec66ef778a1270ab26985ad388309fb45497d9944b6765a34ff12248151: Status 404 returned error can't find the container with id efb23ec66ef778a1270ab26985ad388309fb45497d9944b6765a34ff12248151 Apr 22 18:45:10.819373 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:10.819313 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hhfz5" podStartSLOduration=9.72696017 podStartE2EDuration="11.819292813s" podCreationTimestamp="2026-04-22 18:44:59 +0000 UTC" firstStartedPulling="2026-04-22 18:45:08.071758632 +0000 UTC m=+171.359194787" lastFinishedPulling="2026-04-22 18:45:10.16409127 +0000 UTC m=+173.451527430" observedRunningTime="2026-04-22 18:45:10.818939479 +0000 UTC m=+174.106375659" watchObservedRunningTime="2026-04-22 18:45:10.819292813 +0000 UTC m=+174.106728992" Apr 22 18:45:11.786563 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:11.786521 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" event={"ID":"23d4df5d-868e-4a93-8d62-72f442e82013","Type":"ContainerStarted","Data":"63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920"} Apr 22 18:45:11.786563 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:11.786561 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" event={"ID":"23d4df5d-868e-4a93-8d62-72f442e82013","Type":"ContainerStarted","Data":"efb23ec66ef778a1270ab26985ad388309fb45497d9944b6765a34ff12248151"} Apr 22 18:45:11.811040 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:11.810988 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" podStartSLOduration=17.810971852 podStartE2EDuration="17.810971852s" podCreationTimestamp="2026-04-22 18:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:11.810039476 +0000 UTC m=+175.097475665" watchObservedRunningTime="2026-04-22 18:45:11.810971852 +0000 
UTC m=+175.098408030" Apr 22 18:45:12.789565 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:12.789536 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:45:20.308420 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.308385 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6cd49db9df-9t66v"] Apr 22 18:45:20.383278 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.383246 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-84d66bbd7f-zv9dl"] Apr 22 18:45:20.386780 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.386763 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.408227 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.408196 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84d66bbd7f-zv9dl"] Apr 22 18:45:20.429067 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.429030 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-bound-sa-token\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.429229 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.429122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e39c6e8-66e9-40ad-af3a-752c97681e94-registry-certificates\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.429229 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.429203 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5e39c6e8-66e9-40ad-af3a-752c97681e94-image-registry-private-configuration\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.429338 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.429231 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cl7z\" (UniqueName: \"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-kube-api-access-9cl7z\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.429338 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.429258 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e39c6e8-66e9-40ad-af3a-752c97681e94-trusted-ca\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.429338 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.429312 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e39c6e8-66e9-40ad-af3a-752c97681e94-ca-trust-extracted\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.429470 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.429349 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-registry-tls\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.429470 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.429375 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e39c6e8-66e9-40ad-af3a-752c97681e94-installation-pull-secrets\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.530702 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.530668 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e39c6e8-66e9-40ad-af3a-752c97681e94-registry-certificates\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.530873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.530721 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5e39c6e8-66e9-40ad-af3a-752c97681e94-image-registry-private-configuration\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.530873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.530749 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cl7z\" (UniqueName: \"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-kube-api-access-9cl7z\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.530873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.530773 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e39c6e8-66e9-40ad-af3a-752c97681e94-trusted-ca\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.530873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.530797 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e39c6e8-66e9-40ad-af3a-752c97681e94-ca-trust-extracted\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.531081 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.530871 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-registry-tls\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.531081 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.530912 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e39c6e8-66e9-40ad-af3a-752c97681e94-installation-pull-secrets\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.531081 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.530948 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-bound-sa-token\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.531299 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.531273 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5e39c6e8-66e9-40ad-af3a-752c97681e94-ca-trust-extracted\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.531584 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.531564 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5e39c6e8-66e9-40ad-af3a-752c97681e94-registry-certificates\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.531909 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.531884 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e39c6e8-66e9-40ad-af3a-752c97681e94-trusted-ca\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.533420 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.533393 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5e39c6e8-66e9-40ad-af3a-752c97681e94-installation-pull-secrets\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.533528 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.533431 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/5e39c6e8-66e9-40ad-af3a-752c97681e94-image-registry-private-configuration\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.533528 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.533455 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-registry-tls\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.539911 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.539877 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cl7z\" (UniqueName: \"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-kube-api-access-9cl7z\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.541106 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.541081 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5e39c6e8-66e9-40ad-af3a-752c97681e94-bound-sa-token\") pod \"image-registry-84d66bbd7f-zv9dl\" (UID: \"5e39c6e8-66e9-40ad-af3a-752c97681e94\") " pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.696098 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.696010 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:20.818356 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:20.818327 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-84d66bbd7f-zv9dl"] Apr 22 18:45:20.821447 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:20.821419 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e39c6e8_66e9_40ad_af3a_752c97681e94.slice/crio-a5c110014a671a83194a961a748a7734971efef37b2ab76fd758151d0016019f WatchSource:0}: Error finding container a5c110014a671a83194a961a748a7734971efef37b2ab76fd758151d0016019f: Status 404 returned error can't find the container with id a5c110014a671a83194a961a748a7734971efef37b2ab76fd758151d0016019f Apr 22 18:45:21.816522 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:21.816463 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" event={"ID":"5e39c6e8-66e9-40ad-af3a-752c97681e94","Type":"ContainerStarted","Data":"f161a905d630908d0ede72e27ee6ab440755334087404b8bcf8ef0f426e088e1"} Apr 22 18:45:21.816522 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:21.816526 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" event={"ID":"5e39c6e8-66e9-40ad-af3a-752c97681e94","Type":"ContainerStarted","Data":"a5c110014a671a83194a961a748a7734971efef37b2ab76fd758151d0016019f"} Apr 22 18:45:21.816942 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:21.816585 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:21.868768 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:21.868722 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" podStartSLOduration=1.868706918 podStartE2EDuration="1.868706918s" podCreationTimestamp="2026-04-22 18:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:45:21.867889141 +0000 UTC m=+185.155325344" watchObservedRunningTime="2026-04-22 
18:45:21.868706918 +0000 UTC m=+185.156143095" Apr 22 18:45:28.857784 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.857748 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94"] Apr 22 18:45:28.861064 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.861041 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:28.863813 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.863786 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:45:28.863932 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.863795 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:45:28.863996 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.863795 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:45:28.864190 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.864175 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:45:28.864930 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.864910 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-bmx7k\"" Apr 22 18:45:28.864930 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.864921 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:45:28.877295 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.877272 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94"] Apr 22 18:45:28.895488 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.895459 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-495hl"] Apr 22 18:45:28.897989 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.897965 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.900451 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.900419 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mnhfl\"" Apr 22 18:45:28.900451 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.900423 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:45:28.900635 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.900493 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:45:28.900894 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.900881 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:45:28.995152 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995115 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-textfile\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.995353 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995163 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f7befd2-b691-4c66-8a72-803d0d1ef203-metrics-client-ca\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.995353 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995196 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-wtmp\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.995353 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995254 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4259dbde-47d1-4b05-a472-b19c8b4af292-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:28.995353 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995288 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4259dbde-47d1-4b05-a472-b19c8b4af292-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:28.995353 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995315 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpf8n\" (UniqueName: \"kubernetes.io/projected/7f7befd2-b691-4c66-8a72-803d0d1ef203-kube-api-access-cpf8n\") pod 
\"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.995353 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995346 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgztn\" (UniqueName: \"kubernetes.io/projected/4259dbde-47d1-4b05-a472-b19c8b4af292-kube-api-access-kgztn\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:28.995669 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995418 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4259dbde-47d1-4b05-a472-b19c8b4af292-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:28.995669 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995456 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-accelerators-collector-config\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.995669 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995486 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.995669 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995565 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-root\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.995669 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995619 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-tls\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:28.995846 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:28.995687 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-sys\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.096453 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096416 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-root\") pod \"node-exporter-495hl\" (UID: 
\"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.096637 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096462 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-tls\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.096637 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096544 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-root\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.096637 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096605 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-sys\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.096781 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096637 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-textfile\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.096781 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f7befd2-b691-4c66-8a72-803d0d1ef203-metrics-client-ca\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.096781 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096708 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-sys\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.096781 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096746 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-wtmp\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.097002 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096796 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4259dbde-47d1-4b05-a472-b19c8b4af292-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.097002 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/4259dbde-47d1-4b05-a472-b19c8b4af292-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.097002 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096838 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpf8n\" (UniqueName: \"kubernetes.io/projected/7f7befd2-b691-4c66-8a72-803d0d1ef203-kube-api-access-cpf8n\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.097002 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096867 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgztn\" (UniqueName: \"kubernetes.io/projected/4259dbde-47d1-4b05-a472-b19c8b4af292-kube-api-access-kgztn\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.097002 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096906 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4259dbde-47d1-4b05-a472-b19c8b4af292-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.097002 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096939 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-accelerators-collector-config\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.097002 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.096969 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.097341 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.097200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f7befd2-b691-4c66-8a72-803d0d1ef203-metrics-client-ca\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.097524 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.097468 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4259dbde-47d1-4b05-a472-b19c8b4af292-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.097654 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.097534 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-wtmp\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.097654 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.097627 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-textfile\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.097868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.097845 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-accelerators-collector-config\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.099281 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.099253 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-tls\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.099572 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.099541 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4259dbde-47d1-4b05-a472-b19c8b4af292-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.099786 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.099765 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f7befd2-b691-4c66-8a72-803d0d1ef203-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.099836 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.099822 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4259dbde-47d1-4b05-a472-b19c8b4af292-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.108905 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.108848 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpf8n\" (UniqueName: \"kubernetes.io/projected/7f7befd2-b691-4c66-8a72-803d0d1ef203-kube-api-access-cpf8n\") pod \"node-exporter-495hl\" (UID: \"7f7befd2-b691-4c66-8a72-803d0d1ef203\") " pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.109015 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.109000 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgztn\" (UniqueName: \"kubernetes.io/projected/4259dbde-47d1-4b05-a472-b19c8b4af292-kube-api-access-kgztn\") pod 
\"openshift-state-metrics-9d44df66c-fkg94\" (UID: \"4259dbde-47d1-4b05-a472-b19c8b4af292\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.171001 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.170970 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" Apr 22 18:45:29.209233 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.209190 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-495hl" Apr 22 18:45:29.217870 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:29.217837 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f7befd2_b691_4c66_8a72_803d0d1ef203.slice/crio-1ae57f2ed1109f3e08c250cbd62e6f5f3b10206be6f3de0e6152c8622f7aec0b WatchSource:0}: Error finding container 1ae57f2ed1109f3e08c250cbd62e6f5f3b10206be6f3de0e6152c8622f7aec0b: Status 404 returned error can't find the container with id 1ae57f2ed1109f3e08c250cbd62e6f5f3b10206be6f3de0e6152c8622f7aec0b Apr 22 18:45:29.312057 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.312026 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94"] Apr 22 18:45:29.316672 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:29.316629 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4259dbde_47d1_4b05_a472_b19c8b4af292.slice/crio-ea71ecfeab6bdf055e13bf0004029272fc3f70f2f38191d23a9f9f3c2155f819 WatchSource:0}: Error finding container ea71ecfeab6bdf055e13bf0004029272fc3f70f2f38191d23a9f9f3c2155f819: Status 404 returned error can't find the container with id ea71ecfeab6bdf055e13bf0004029272fc3f70f2f38191d23a9f9f3c2155f819 Apr 22 18:45:29.840571 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.840528 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" event={"ID":"4259dbde-47d1-4b05-a472-b19c8b4af292","Type":"ContainerStarted","Data":"84f18bd8b5d9d2b3d38c7d4b041f5794323b53532cabc8a4942800314d01891b"} Apr 22 18:45:29.840571 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.840571 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" event={"ID":"4259dbde-47d1-4b05-a472-b19c8b4af292","Type":"ContainerStarted","Data":"adab94da61ad68fea323862180fef581c64a9512db4491434bf80de5b27bc00f"} Apr 22 18:45:29.840807 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.840585 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" event={"ID":"4259dbde-47d1-4b05-a472-b19c8b4af292","Type":"ContainerStarted","Data":"ea71ecfeab6bdf055e13bf0004029272fc3f70f2f38191d23a9f9f3c2155f819"} Apr 22 18:45:29.841856 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:29.841819 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-495hl" event={"ID":"7f7befd2-b691-4c66-8a72-803d0d1ef203","Type":"ContainerStarted","Data":"1ae57f2ed1109f3e08c250cbd62e6f5f3b10206be6f3de0e6152c8622f7aec0b"} Apr 22 18:45:30.315372 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.315330 2568 patch_prober.go:28] interesting pod/image-registry-6cd49db9df-9t66v container/registry namespace/openshift-image-registry: Readiness probe 
status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:45:30.315802 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.315405 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" podUID="23d4df5d-868e-4a93-8d62-72f442e82013" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:45:30.846265 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.846229 2568 generic.go:358] "Generic (PLEG): container finished" podID="7f7befd2-b691-4c66-8a72-803d0d1ef203" containerID="d764ecdc29754d996f2e5e8d96da91ae2c0235ed5131a32b33c1971b95a0055c" exitCode=0 Apr 22 18:45:30.846574 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.846312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-495hl" event={"ID":"7f7befd2-b691-4c66-8a72-803d0d1ef203","Type":"ContainerDied","Data":"d764ecdc29754d996f2e5e8d96da91ae2c0235ed5131a32b33c1971b95a0055c"} Apr 22 18:45:30.848051 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.848020 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" event={"ID":"4259dbde-47d1-4b05-a472-b19c8b4af292","Type":"ContainerStarted","Data":"f284028a0c4a754501c808921145d75f6b946846a7c769293515f36a32f93c5e"} Apr 22 18:45:30.887423 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.887395 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7bd7484947-44nwn"] Apr 22 18:45:30.890699 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.890683 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:30.893161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.893136 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 18:45:30.893279 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.893215 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 18:45:30.893279 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.893270 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-tkldt\"" Apr 22 18:45:30.893408 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.893216 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fagn202h6a4ao\"" Apr 22 18:45:30.893408 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.893222 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 18:45:30.893408 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.893269 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 18:45:30.893590 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.893462 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-fkg94" podStartSLOduration=1.9117752829999999 podStartE2EDuration="2.893451177s" podCreationTimestamp="2026-04-22 18:45:28 +0000 UTC" firstStartedPulling="2026-04-22 18:45:29.431355621 +0000 UTC m=+192.718791777" lastFinishedPulling="2026-04-22 18:45:30.413031499 +0000 UTC m=+193.700467671" observedRunningTime="2026-04-22 18:45:30.892169477 +0000 UTC m=+194.179605655" watchObservedRunningTime="2026-04-22 18:45:30.893451177 +0000 UTC m=+194.180887354" Apr 22 18:45:30.893656 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.893590 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 18:45:30.905557 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:30.905526 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7bd7484947-44nwn"] Apr 22 18:45:31.012735 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.012696 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.012923 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.012762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e36d6c01-6e8b-4001-8601-07ca12c15e68-metrics-client-ca\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.012923 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.012827 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.012923 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.012873 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cqpl\" (UniqueName: \"kubernetes.io/projected/e36d6c01-6e8b-4001-8601-07ca12c15e68-kube-api-access-7cqpl\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.012923 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.012902 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-tls\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.013130 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.012957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.013130 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.013036 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.013130 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.013081 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-grpc-tls\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.114446 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.114334 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.114446 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.114426 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.114710 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.114465 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-grpc-tls\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.114710 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.114535 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.114710 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.114597 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e36d6c01-6e8b-4001-8601-07ca12c15e68-metrics-client-ca\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.114710 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.114621 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.114710 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.114648 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cqpl\" (UniqueName: \"kubernetes.io/projected/e36d6c01-6e8b-4001-8601-07ca12c15e68-kube-api-access-7cqpl\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.114710 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.114676 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-tls\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.115433 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.115406 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e36d6c01-6e8b-4001-8601-07ca12c15e68-metrics-client-ca\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.117274 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.117227 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.117274 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.117252 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-grpc-tls\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.117274 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.117227 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.117527 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.117493 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.117584 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.117560 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.117664 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.117650 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e36d6c01-6e8b-4001-8601-07ca12c15e68-secret-thanos-querier-tls\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.127556 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.127529 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cqpl\" (UniqueName: \"kubernetes.io/projected/e36d6c01-6e8b-4001-8601-07ca12c15e68-kube-api-access-7cqpl\") pod \"thanos-querier-7bd7484947-44nwn\" (UID: \"e36d6c01-6e8b-4001-8601-07ca12c15e68\") " pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.199995 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.199951 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:31.323464 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.323439 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7bd7484947-44nwn"] Apr 22 18:45:31.325822 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:31.325796 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode36d6c01_6e8b_4001_8601_07ca12c15e68.slice/crio-420ad0952e89b55600bb16e371c0241ea720a90bca5a1a4131b8368d2d97fd03 WatchSource:0}: Error finding container 420ad0952e89b55600bb16e371c0241ea720a90bca5a1a4131b8368d2d97fd03: Status 404 returned error can't find the container with id 420ad0952e89b55600bb16e371c0241ea720a90bca5a1a4131b8368d2d97fd03 Apr 22 18:45:31.853536 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.853405 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" event={"ID":"e36d6c01-6e8b-4001-8601-07ca12c15e68","Type":"ContainerStarted","Data":"420ad0952e89b55600bb16e371c0241ea720a90bca5a1a4131b8368d2d97fd03"} Apr 22 18:45:31.856285 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.856225 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-495hl" event={"ID":"7f7befd2-b691-4c66-8a72-803d0d1ef203","Type":"ContainerStarted","Data":"6f016677353fdad5d3798f5e0bdc56ceff84a6e29e00d9e2395eb3d772328685"} Apr 22 18:45:31.856285 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.856275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-495hl" event={"ID":"7f7befd2-b691-4c66-8a72-803d0d1ef203","Type":"ContainerStarted","Data":"099aadc16797bb9d4c0ec048227cf95622d3613e630035167e7dad8807b6d2e8"} Apr 22 18:45:31.879442 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:31.879392 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-495hl" podStartSLOduration=3.037427672 podStartE2EDuration="3.879377372s" podCreationTimestamp="2026-04-22 18:45:28 +0000 UTC" firstStartedPulling="2026-04-22 18:45:29.219922402 +0000 UTC m=+192.507358566" lastFinishedPulling="2026-04-22 18:45:30.061872098 +0000 UTC m=+193.349308266" observedRunningTime="2026-04-22 18:45:31.877222685 +0000 UTC m=+195.164658898" watchObservedRunningTime="2026-04-22 18:45:31.879377372 +0000 UTC m=+195.166813586" Apr 22 18:45:33.412565 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.412536 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-95b554d55-6w22l"] Apr 22 18:45:33.415382 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.415358 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.419550 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.419529 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:45:33.420389 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.420369 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-cg97ejfpqdl6p\"" Apr 22 18:45:33.420517 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.420466 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-c5rck\"" Apr 22 18:45:33.420790 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.420734 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:45:33.420790 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.420736 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:45:33.420943 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.420815 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:45:33.426182 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.426161 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-95b554d55-6w22l"] Apr 22 18:45:33.534461 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.534362 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/32fb26b5-bd68-4510-9211-df5d18a08f2a-metrics-server-audit-profiles\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.534461 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.534443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-secret-metrics-server-client-certs\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.534888 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.534541 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/32fb26b5-bd68-4510-9211-df5d18a08f2a-audit-log\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.534888 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.534576 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-secret-metrics-server-tls\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.534888 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.534606 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fb26b5-bd68-4510-9211-df5d18a08f2a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.534888 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.534635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-client-ca-bundle\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.534888 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.534652 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl47d\" (UniqueName: \"kubernetes.io/projected/32fb26b5-bd68-4510-9211-df5d18a08f2a-kube-api-access-sl47d\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.635748 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.635711 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-secret-metrics-server-client-certs\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.635920 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.635758 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/32fb26b5-bd68-4510-9211-df5d18a08f2a-audit-log\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.635920 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.635883 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-secret-metrics-server-tls\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.635920 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.635917 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fb26b5-bd68-4510-9211-df5d18a08f2a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.636053 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.635950 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-client-ca-bundle\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.636142 ip-10-0-137-223 kubenswrapper[2568]: I0422 
18:45:33.636108 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sl47d\" (UniqueName: \"kubernetes.io/projected/32fb26b5-bd68-4510-9211-df5d18a08f2a-kube-api-access-sl47d\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.636440 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.636155 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/32fb26b5-bd68-4510-9211-df5d18a08f2a-audit-log\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.636440 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.636334 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/32fb26b5-bd68-4510-9211-df5d18a08f2a-metrics-server-audit-profiles\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.636673 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.636650 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32fb26b5-bd68-4510-9211-df5d18a08f2a-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.637982 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.637957 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/32fb26b5-bd68-4510-9211-df5d18a08f2a-metrics-server-audit-profiles\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.638613 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.638385 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-secret-metrics-server-client-certs\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.638613 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.638529 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-secret-metrics-server-tls\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.638613 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.638548 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32fb26b5-bd68-4510-9211-df5d18a08f2a-client-ca-bundle\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.647803 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.647783 2568 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx"] Apr 22 18:45:33.650671 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.650655 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" Apr 22 18:45:33.653119 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.653097 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl47d\" (UniqueName: \"kubernetes.io/projected/32fb26b5-bd68-4510-9211-df5d18a08f2a-kube-api-access-sl47d\") pod \"metrics-server-95b554d55-6w22l\" (UID: \"32fb26b5-bd68-4510-9211-df5d18a08f2a\") " pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.653632 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.653615 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fjqwd\"" Apr 22 18:45:33.653846 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.653832 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:45:33.670568 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.670546 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx"] Apr 22 18:45:33.732407 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.732370 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:33.737361 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.737334 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8f6b62ea-4987-4f68-9fbb-e65c98816700-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fnqsx\" (UID: \"8f6b62ea-4987-4f68-9fbb-e65c98816700\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" Apr 22 18:45:33.838300 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.838270 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8f6b62ea-4987-4f68-9fbb-e65c98816700-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fnqsx\" (UID: \"8f6b62ea-4987-4f68-9fbb-e65c98816700\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" Apr 22 18:45:33.838473 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:33.838450 2568 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 18:45:33.838559 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:33.838547 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f6b62ea-4987-4f68-9fbb-e65c98816700-monitoring-plugin-cert podName:8f6b62ea-4987-4f68-9fbb-e65c98816700 nodeName:}" failed. No retries permitted until 2026-04-22 18:45:34.33852483 +0000 UTC m=+197.625960988 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/8f6b62ea-4987-4f68-9fbb-e65c98816700-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-fnqsx" (UID: "8f6b62ea-4987-4f68-9fbb-e65c98816700") : secret "monitoring-plugin-cert" not found Apr 22 18:45:33.852474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.852445 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-95b554d55-6w22l"] Apr 22 18:45:33.855468 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:33.855444 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32fb26b5_bd68_4510_9211_df5d18a08f2a.slice/crio-a8b1957c3119617090767465b58c005808714bc187de99519b244e2c7b02a2e2 WatchSource:0}: Error finding container a8b1957c3119617090767465b58c005808714bc187de99519b244e2c7b02a2e2: Status 404 returned error can't find the container with id a8b1957c3119617090767465b58c005808714bc187de99519b244e2c7b02a2e2 Apr 22 18:45:33.862012 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.861988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" event={"ID":"32fb26b5-bd68-4510-9211-df5d18a08f2a","Type":"ContainerStarted","Data":"a8b1957c3119617090767465b58c005808714bc187de99519b244e2c7b02a2e2"} Apr 22 18:45:33.863886 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.863863 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" event={"ID":"e36d6c01-6e8b-4001-8601-07ca12c15e68","Type":"ContainerStarted","Data":"299638d04f08991bb4f31fed88d62567681c0bf822d64e73cf8dae3988094ddd"} Apr 22 18:45:33.863962 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.863891 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" event={"ID":"e36d6c01-6e8b-4001-8601-07ca12c15e68","Type":"ContainerStarted","Data":"29bbcde8f7156420b2e1a65bbbbd23cfb93519577bde5a5863ebd2ef5b1c4362"} Apr 22 18:45:33.863962 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:33.863901 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" event={"ID":"e36d6c01-6e8b-4001-8601-07ca12c15e68","Type":"ContainerStarted","Data":"2062ef632f9f685f683c5d0f43eadd29bf59b0f0f4e8d6384f171b9de40a836e"} Apr 22 18:45:34.125427 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.125337 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2"] Apr 22 18:45:34.128210 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.128187 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.130809 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.130765 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:45:34.130930 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.130912 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:45:34.131172 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.131007 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:45:34.131172 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.131078 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:45:34.131172 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.131136 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:45:34.131543 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.131525 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-6k74r\"" Apr 22 18:45:34.135945 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.135924 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:45:34.143468 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.143443 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2"] Apr 22 18:45:34.242098 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.242068 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.242206 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.242128 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-secret-telemeter-client\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.242206 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.242165 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-serving-certs-ca-bundle\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.242206 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.242188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.242377 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.242220 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqq4c\" (UniqueName: \"kubernetes.io/projected/5153a39e-9b48-415d-8a21-4879a686d80b-kube-api-access-cqq4c\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.242377 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.242310 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-metrics-client-ca\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.242489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.242388 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-federate-client-tls\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.242489 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.242416 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-telemeter-client-tls\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343284 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343256 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-metrics-client-ca\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343409 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-federate-client-tls\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343409 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343332 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-telemeter-client-tls\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343409 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343404 2568 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343643 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-secret-telemeter-client\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343643 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343480 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-serving-certs-ca-bundle\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343643 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343528 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343643 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343563 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqq4c\" (UniqueName: \"kubernetes.io/projected/5153a39e-9b48-415d-8a21-4879a686d80b-kube-api-access-cqq4c\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.343643 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.343597 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8f6b62ea-4987-4f68-9fbb-e65c98816700-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fnqsx\" (UID: \"8f6b62ea-4987-4f68-9fbb-e65c98816700\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" Apr 22 18:45:34.344542 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.344468 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-metrics-client-ca\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.344665 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.344640 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-serving-certs-ca-bundle\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.345284 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.345257 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5153a39e-9b48-415d-8a21-4879a686d80b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.346592 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.346516 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8f6b62ea-4987-4f68-9fbb-e65c98816700-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-fnqsx\" (UID: \"8f6b62ea-4987-4f68-9fbb-e65c98816700\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" Apr 22 18:45:34.347117 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.346859 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-telemeter-client-tls\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.347117 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.347088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-secret-telemeter-client\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.347245 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.347090 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-federate-client-tls\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.348005 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.347986 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5153a39e-9b48-415d-8a21-4879a686d80b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.355312 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.355281 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqq4c\" (UniqueName: \"kubernetes.io/projected/5153a39e-9b48-415d-8a21-4879a686d80b-kube-api-access-cqq4c\") pod \"telemeter-client-7bb9d8c5b9-s7rr2\" (UID: \"5153a39e-9b48-415d-8a21-4879a686d80b\") " pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.439655 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.439625 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" Apr 22 18:45:34.559242 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.559212 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2"] Apr 22 18:45:34.559390 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.559256 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" Apr 22 18:45:34.561966 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:34.561938 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5153a39e_9b48_415d_8a21_4879a686d80b.slice/crio-b39a50a4215da45fc6b45d9926adc5edd01664bf3a930bca36c9a97e5f0d80ff WatchSource:0}: Error finding container b39a50a4215da45fc6b45d9926adc5edd01664bf3a930bca36c9a97e5f0d80ff: Status 404 returned error can't find the container with id b39a50a4215da45fc6b45d9926adc5edd01664bf3a930bca36c9a97e5f0d80ff Apr 22 18:45:34.679583 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.679450 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx"] Apr 22 18:45:34.682470 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:34.682444 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f6b62ea_4987_4f68_9fbb_e65c98816700.slice/crio-97f4798252f519949414d827181acff01101b6e91f8d6f27e16ab8c2870884b8 WatchSource:0}: Error finding container 97f4798252f519949414d827181acff01101b6e91f8d6f27e16ab8c2870884b8: Status 404 returned error can't find the container with id 97f4798252f519949414d827181acff01101b6e91f8d6f27e16ab8c2870884b8 Apr 22 18:45:34.869407 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.869365 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" event={"ID":"e36d6c01-6e8b-4001-8601-07ca12c15e68","Type":"ContainerStarted","Data":"bae798d30d546eeca01924d80dfbd43244b94eda0c6656c7a45d65a7fcd0f35d"} Apr 22 18:45:34.869407 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.869409 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" event={"ID":"e36d6c01-6e8b-4001-8601-07ca12c15e68","Type":"ContainerStarted","Data":"1470031ab4ccbc094d55553797fac94a97af3fa3aed3e531e63d30f4e318879a"} Apr 22 18:45:34.869641 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.869424 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" event={"ID":"e36d6c01-6e8b-4001-8601-07ca12c15e68","Type":"ContainerStarted","Data":"c0182dbd7f56b6fbb97c3a9673ff24c4eb0dbf375ca0f153459a66af42930d56"} Apr 22 18:45:34.869641 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.869542 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:34.870367 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.870336 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" event={"ID":"8f6b62ea-4987-4f68-9fbb-e65c98816700","Type":"ContainerStarted","Data":"97f4798252f519949414d827181acff01101b6e91f8d6f27e16ab8c2870884b8"} Apr 22 18:45:34.871188 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.871171 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" event={"ID":"5153a39e-9b48-415d-8a21-4879a686d80b","Type":"ContainerStarted","Data":"b39a50a4215da45fc6b45d9926adc5edd01664bf3a930bca36c9a97e5f0d80ff"} Apr 22 18:45:34.904022 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:34.903971 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" podStartSLOduration=2.005913404 podStartE2EDuration="4.903956397s" podCreationTimestamp="2026-04-22 18:45:30 +0000 UTC" firstStartedPulling="2026-04-22 18:45:31.327726499 +0000 UTC m=+194.615162655" lastFinishedPulling="2026-04-22 18:45:34.225769483 +0000 UTC m=+197.513205648" observedRunningTime="2026-04-22 18:45:34.903688733 +0000 UTC m=+198.191124910" watchObservedRunningTime="2026-04-22 18:45:34.903956397 +0000 UTC m=+198.191392575" Apr 22 18:45:36.879123 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:36.879083 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" event={"ID":"5153a39e-9b48-415d-8a21-4879a686d80b","Type":"ContainerStarted","Data":"00960875e0e8cc500387baaa2cfa629c491c4893bf9ee449dcf9b056c49b6d8a"} Apr 22 18:45:36.880692 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:36.880655 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" event={"ID":"32fb26b5-bd68-4510-9211-df5d18a08f2a","Type":"ContainerStarted","Data":"74b5089ce91a22116e48cf3c7c5f87b1acd380755e6c282dd369174b7a56bd3a"} Apr 22 18:45:36.900943 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:36.900899 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" podStartSLOduration=1.245806498 podStartE2EDuration="3.900885996s" podCreationTimestamp="2026-04-22 18:45:33 +0000 UTC" firstStartedPulling="2026-04-22 18:45:33.857762237 +0000 UTC m=+197.145198393" lastFinishedPulling="2026-04-22 18:45:36.512841733 +0000 UTC m=+199.800277891" observedRunningTime="2026-04-22 18:45:36.900421251 +0000 UTC m=+200.187857488" watchObservedRunningTime="2026-04-22 18:45:36.900885996 +0000 UTC m=+200.188322174" Apr 22 18:45:37.884210 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:37.884125 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" event={"ID":"8f6b62ea-4987-4f68-9fbb-e65c98816700","Type":"ContainerStarted","Data":"67ae5837ab025143af7398735bc5c225e312160110f26f31cff19261ba2204ea"} Apr 22 18:45:37.884681 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:37.884311 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" Apr 22 18:45:37.886216 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:37.886181 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" event={"ID":"5153a39e-9b48-415d-8a21-4879a686d80b","Type":"ContainerStarted","Data":"cb492bccfeee616fe5d47dd2f4af2073052fd0acd496cc00419cd98e092f41c5"} Apr 22 18:45:37.886216 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:37.886217 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" event={"ID":"5153a39e-9b48-415d-8a21-4879a686d80b","Type":"ContainerStarted","Data":"4622d848f5fae1c2071af1064f26e19c80d0f241d85e01c2967e721058ac319b"} Apr 22 18:45:37.889478 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:37.889458 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" Apr 22 18:45:37.907096 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:37.907043 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-fnqsx" podStartSLOduration=2.028518243 podStartE2EDuration="4.907027433s" podCreationTimestamp="2026-04-22 18:45:33 +0000 UTC" firstStartedPulling="2026-04-22 18:45:34.68476871 +0000 UTC m=+197.972204866" lastFinishedPulling="2026-04-22 18:45:37.563277898 +0000 UTC m=+200.850714056" observedRunningTime="2026-04-22 18:45:37.905996925 +0000 UTC m=+201.193433127" watchObservedRunningTime="2026-04-22 18:45:37.907027433 +0000 UTC m=+201.194463611" Apr 22 18:45:37.937134 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:37.937082 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7bb9d8c5b9-s7rr2" podStartSLOduration=0.937232053 podStartE2EDuration="3.937067986s" podCreationTimestamp="2026-04-22 18:45:34 +0000 UTC" firstStartedPulling="2026-04-22 18:45:34.564268158 +0000 UTC m=+197.851704314" lastFinishedPulling="2026-04-22 18:45:37.564104091 +0000 UTC m=+200.851540247" observedRunningTime="2026-04-22 18:45:37.935405909 +0000 UTC m=+201.222842087" watchObservedRunningTime="2026-04-22 18:45:37.937067986 +0000 UTC m=+201.224504164" Apr 22 18:45:40.314791 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:40.314757 2568 patch_prober.go:28] interesting pod/image-registry-6cd49db9df-9t66v container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:45:40.315230 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:40.314825 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" podUID="23d4df5d-868e-4a93-8d62-72f442e82013" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:45:40.700565 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:40.700476 2568 patch_prober.go:28] interesting pod/image-registry-84d66bbd7f-zv9dl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:45:40.700565 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:40.700547 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" podUID="5e39c6e8-66e9-40ad-af3a-752c97681e94" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:45:40.882285 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:40.882258 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7bd7484947-44nwn" Apr 22 18:45:42.825492 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:42.825451 2568 patch_prober.go:28] interesting pod/image-registry-84d66bbd7f-zv9dl container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:45:42.825932 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:42.825526 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" 
podUID="5e39c6e8-66e9-40ad-af3a-752c97681e94" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:45:45.327453 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.327415 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" podUID="23d4df5d-868e-4a93-8d62-72f442e82013" containerName="registry" containerID="cri-o://63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920" gracePeriod=30 Apr 22 18:45:45.562840 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.562813 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:45:45.644364 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644279 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-bound-sa-token\") pod \"23d4df5d-868e-4a93-8d62-72f442e82013\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " Apr 22 18:45:45.644364 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644318 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23d4df5d-868e-4a93-8d62-72f442e82013-ca-trust-extracted\") pod \"23d4df5d-868e-4a93-8d62-72f442e82013\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " Apr 22 18:45:45.644364 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644355 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-registry-certificates\") pod \"23d4df5d-868e-4a93-8d62-72f442e82013\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " Apr 22 18:45:45.644639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644375 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") pod \"23d4df5d-868e-4a93-8d62-72f442e82013\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " Apr 22 18:45:45.644639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644444 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-image-registry-private-configuration\") pod \"23d4df5d-868e-4a93-8d62-72f442e82013\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " Apr 22 18:45:45.644639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644482 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9kvf\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-kube-api-access-p9kvf\") pod \"23d4df5d-868e-4a93-8d62-72f442e82013\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " Apr 22 18:45:45.644639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644525 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-installation-pull-secrets\") pod \"23d4df5d-868e-4a93-8d62-72f442e82013\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " Apr 22 18:45:45.644639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644548 2568 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-trusted-ca\") pod \"23d4df5d-868e-4a93-8d62-72f442e82013\" (UID: \"23d4df5d-868e-4a93-8d62-72f442e82013\") " Apr 22 18:45:45.644884 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.644851 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "23d4df5d-868e-4a93-8d62-72f442e82013" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:45:45.645213 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.645187 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "23d4df5d-868e-4a93-8d62-72f442e82013" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:45:45.647194 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.647161 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "23d4df5d-868e-4a93-8d62-72f442e82013" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:45:45.647306 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.647194 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "23d4df5d-868e-4a93-8d62-72f442e82013" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:45:45.647306 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.647199 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "23d4df5d-868e-4a93-8d62-72f442e82013" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:45.647306 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.647241 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-kube-api-access-p9kvf" (OuterVolumeSpecName: "kube-api-access-p9kvf") pod "23d4df5d-868e-4a93-8d62-72f442e82013" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013"). InnerVolumeSpecName "kube-api-access-p9kvf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:45.647410 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.647396 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "23d4df5d-868e-4a93-8d62-72f442e82013" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013"). 
InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:45.652956 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.652934 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23d4df5d-868e-4a93-8d62-72f442e82013-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "23d4df5d-868e-4a93-8d62-72f442e82013" (UID: "23d4df5d-868e-4a93-8d62-72f442e82013"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:45:45.746269 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.746216 2568 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-image-registry-private-configuration\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:45:45.746269 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.746266 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p9kvf\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-kube-api-access-p9kvf\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:45:45.746269 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.746276 2568 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/23d4df5d-868e-4a93-8d62-72f442e82013-installation-pull-secrets\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:45:45.746269 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.746286 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-trusted-ca\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:45:45.746549 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.746296 2568 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-bound-sa-token\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:45:45.746549 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.746305 2568 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/23d4df5d-868e-4a93-8d62-72f442e82013-ca-trust-extracted\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:45:45.746549 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.746314 2568 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/23d4df5d-868e-4a93-8d62-72f442e82013-registry-certificates\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:45:45.746549 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.746323 2568 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/23d4df5d-868e-4a93-8d62-72f442e82013-registry-tls\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:45:45.912824 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.912740 2568 generic.go:358] "Generic (PLEG): container finished" podID="23d4df5d-868e-4a93-8d62-72f442e82013" containerID="63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920" exitCode=0 Apr 22 18:45:45.912824 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.912794 2568 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" Apr 22 18:45:45.912824 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.912818 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" event={"ID":"23d4df5d-868e-4a93-8d62-72f442e82013","Type":"ContainerDied","Data":"63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920"} Apr 22 18:45:45.913041 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.912843 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6cd49db9df-9t66v" event={"ID":"23d4df5d-868e-4a93-8d62-72f442e82013","Type":"ContainerDied","Data":"efb23ec66ef778a1270ab26985ad388309fb45497d9944b6765a34ff12248151"} Apr 22 18:45:45.913041 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.912860 2568 scope.go:117] "RemoveContainer" containerID="63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920" Apr 22 18:45:45.920951 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.920936 2568 scope.go:117] "RemoveContainer" containerID="63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920" Apr 22 18:45:45.921206 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:45:45.921187 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920\": container with ID starting with 63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920 not found: ID does not exist" containerID="63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920" Apr 22 18:45:45.921266 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.921218 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920"} err="failed to get container status \"63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920\": rpc error: code = NotFound desc = could not find container \"63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920\": container with ID starting with 63ab6e2bcc243f23b976d033fd3e27ed1c53ec07c124cf3c5f611282ab4bf920 not found: ID does not exist" Apr 22 18:45:45.934039 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.934013 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6cd49db9df-9t66v"] Apr 22 18:45:45.939977 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:45.939954 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6cd49db9df-9t66v"] Apr 22 18:45:46.929028 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.928993 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-zchbz"] Apr 22 18:45:46.929406 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.929371 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23d4df5d-868e-4a93-8d62-72f442e82013" containerName="registry" Apr 22 18:45:46.929406 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.929387 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d4df5d-868e-4a93-8d62-72f442e82013" containerName="registry" Apr 22 18:45:46.929473 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.929436 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="23d4df5d-868e-4a93-8d62-72f442e82013" containerName="registry" Apr 22 18:45:46.934368 
ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.934349 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zchbz" Apr 22 18:45:46.942135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.942116 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:45:46.942919 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.942905 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-q75b5\"" Apr 22 18:45:46.943144 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.943128 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:45:46.953792 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:46.953769 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zchbz"] Apr 22 18:45:47.058904 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:47.058860 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k429s\" (UniqueName: \"kubernetes.io/projected/d47db21d-84ef-412a-8e2e-dd9b3fc6599b-kube-api-access-k429s\") pod \"downloads-6bcc868b7-zchbz\" (UID: \"d47db21d-84ef-412a-8e2e-dd9b3fc6599b\") " pod="openshift-console/downloads-6bcc868b7-zchbz" Apr 22 18:45:47.159860 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:47.159826 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k429s\" (UniqueName: \"kubernetes.io/projected/d47db21d-84ef-412a-8e2e-dd9b3fc6599b-kube-api-access-k429s\") pod \"downloads-6bcc868b7-zchbz\" (UID: \"d47db21d-84ef-412a-8e2e-dd9b3fc6599b\") " pod="openshift-console/downloads-6bcc868b7-zchbz" Apr 22 18:45:47.169349 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:47.169321 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k429s\" (UniqueName: \"kubernetes.io/projected/d47db21d-84ef-412a-8e2e-dd9b3fc6599b-kube-api-access-k429s\") pod \"downloads-6bcc868b7-zchbz\" (UID: \"d47db21d-84ef-412a-8e2e-dd9b3fc6599b\") " pod="openshift-console/downloads-6bcc868b7-zchbz" Apr 22 18:45:47.242578 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:47.242483 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zchbz" Apr 22 18:45:47.307138 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:47.307110 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23d4df5d-868e-4a93-8d62-72f442e82013" path="/var/lib/kubelet/pods/23d4df5d-868e-4a93-8d62-72f442e82013/volumes" Apr 22 18:45:47.365332 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:47.365297 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zchbz"] Apr 22 18:45:47.368015 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:47.367991 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd47db21d_84ef_412a_8e2e_dd9b3fc6599b.slice/crio-d2afd9e5bad52c5a7cb0ff9f061e7ace97cb8c550355260cc5e20e5c43cadbe0 WatchSource:0}: Error finding container d2afd9e5bad52c5a7cb0ff9f061e7ace97cb8c550355260cc5e20e5c43cadbe0: Status 404 returned error can't find the container with id d2afd9e5bad52c5a7cb0ff9f061e7ace97cb8c550355260cc5e20e5c43cadbe0 Apr 22 18:45:47.920200 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:47.920159 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zchbz" event={"ID":"d47db21d-84ef-412a-8e2e-dd9b3fc6599b","Type":"ContainerStarted","Data":"d2afd9e5bad52c5a7cb0ff9f061e7ace97cb8c550355260cc5e20e5c43cadbe0"} Apr 22 18:45:50.700865 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:50.700828 2568 patch_prober.go:28] interesting pod/image-registry-84d66bbd7f-zv9dl container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:45:50.701345 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:50.700890 2568 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" podUID="5e39c6e8-66e9-40ad-af3a-752c97681e94" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:45:52.826162 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:52.826135 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-84d66bbd7f-zv9dl" Apr 22 18:45:53.732711 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:53.732669 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:53.732711 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:53.732721 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:45:55.494937 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.494900 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-547d4f7998-lnfs2"] Apr 22 18:45:55.498765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.498741 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.502530 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.502478 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 18:45:55.502654 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.502530 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 18:45:55.502654 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.502563 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 18:45:55.502654 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.502593 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 18:45:55.502654 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.502488 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-4t2nf\"" Apr 22 18:45:55.503029 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.502999 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 18:45:55.509859 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.509836 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547d4f7998-lnfs2"] Apr 22 18:45:55.636210 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.636162 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-oauth-serving-cert\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.636336 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.636278 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-config\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.636408 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.636367 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-oauth-config\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.636476 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.636436 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr7pk\" (UniqueName: \"kubernetes.io/projected/135bb52e-eb9e-4324-b484-4d2a7afbd52b-kube-api-access-cr7pk\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.636556 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.636474 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-service-ca\") pod 
\"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.636556 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.636543 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-serving-cert\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.737610 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.737569 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-oauth-config\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.737800 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.737657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cr7pk\" (UniqueName: \"kubernetes.io/projected/135bb52e-eb9e-4324-b484-4d2a7afbd52b-kube-api-access-cr7pk\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.737868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.737805 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-service-ca\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.737868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.737853 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-serving-cert\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.737971 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.737914 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-oauth-serving-cert\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.737971 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.737956 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-config\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.738594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.738567 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-service-ca\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.738693 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.738633 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-config\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.738752 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.738727 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-oauth-serving-cert\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.740689 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.740665 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-oauth-config\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.740793 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.740737 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-serving-cert\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.746590 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.746539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cr7pk\" (UniqueName: \"kubernetes.io/projected/135bb52e-eb9e-4324-b484-4d2a7afbd52b-kube-api-access-cr7pk\") pod \"console-547d4f7998-lnfs2\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.810555 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.810516 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:45:55.944259 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.944226 2568 generic.go:358] "Generic (PLEG): container finished" podID="b8003c92-2bc0-4825-974f-12470b332830" containerID="4c2770a5e14b94dc933f7febba5bfdd712f6ca4336a1bc5f49d096c097d7e09c" exitCode=0 Apr 22 18:45:55.944438 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.944285 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zw28s" event={"ID":"b8003c92-2bc0-4825-974f-12470b332830","Type":"ContainerDied","Data":"4c2770a5e14b94dc933f7febba5bfdd712f6ca4336a1bc5f49d096c097d7e09c"} Apr 22 18:45:55.944740 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.944715 2568 scope.go:117] "RemoveContainer" containerID="4c2770a5e14b94dc933f7febba5bfdd712f6ca4336a1bc5f49d096c097d7e09c" Apr 22 18:45:55.950002 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:55.949978 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-547d4f7998-lnfs2"] Apr 22 18:45:55.954478 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:45:55.954440 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod135bb52e_eb9e_4324_b484_4d2a7afbd52b.slice/crio-bd3faa5a25c63e48b3292cba19ed47c03f13a33b223a0a88f804c8b5cf8e7bf1 WatchSource:0}: Error finding container bd3faa5a25c63e48b3292cba19ed47c03f13a33b223a0a88f804c8b5cf8e7bf1: Status 404 returned error can't find the container with id bd3faa5a25c63e48b3292cba19ed47c03f13a33b223a0a88f804c8b5cf8e7bf1 Apr 22 18:45:56.950384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:56.950347 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-zw28s" event={"ID":"b8003c92-2bc0-4825-974f-12470b332830","Type":"ContainerStarted","Data":"697fe05cbefd5ca9b9f27f751702b10dd4dfe22d46cd5e048ed0f4936a970cd2"} Apr 22 18:45:56.952140 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:56.952106 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547d4f7998-lnfs2" event={"ID":"135bb52e-eb9e-4324-b484-4d2a7afbd52b","Type":"ContainerStarted","Data":"bd3faa5a25c63e48b3292cba19ed47c03f13a33b223a0a88f804c8b5cf8e7bf1"} Apr 22 18:45:58.513548 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:45:58.513516 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hn296_7b2ce3e8-c29b-4c51-bc68-022864d2d2fa/dns-node-resolver/0.log" Apr 22 18:46:02.009336 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.009300 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56cd89967c-2mf8r"] Apr 22 18:46:02.015078 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.015048 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.025256 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.025212 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 18:46:02.025956 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.025936 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56cd89967c-2mf8r"] Apr 22 18:46:02.092572 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.092538 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rqql\" (UniqueName: \"kubernetes.io/projected/42e9a5ee-55d6-437e-9c9d-511bd4d82837-kube-api-access-9rqql\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.092853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.092586 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-config\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.092853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.092609 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-oauth-serving-cert\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.092853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.092733 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-oauth-config\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.092853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.092771 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-trusted-ca-bundle\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.092853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.092832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-serving-cert\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.093121 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.092872 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-service-ca\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.193665 ip-10-0-137-223 
kubenswrapper[2568]: I0422 18:46:02.193625 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-serving-cert\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.193960 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.193683 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-service-ca\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.193960 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.193776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rqql\" (UniqueName: \"kubernetes.io/projected/42e9a5ee-55d6-437e-9c9d-511bd4d82837-kube-api-access-9rqql\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.193960 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.193806 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-config\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.193960 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.193829 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-oauth-serving-cert\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.193960 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.193868 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-oauth-config\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.193960 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.193897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-trusted-ca-bundle\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.194750 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.194717 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-service-ca\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.194944 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.194894 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-config\") pod \"console-56cd89967c-2mf8r\" (UID: 
\"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.195049 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.195010 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-trusted-ca-bundle\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.195187 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.195161 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-oauth-serving-cert\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.196490 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.196467 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-serving-cert\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.196598 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.196566 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-oauth-config\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.203077 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.203052 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rqql\" (UniqueName: \"kubernetes.io/projected/42e9a5ee-55d6-437e-9c9d-511bd4d82837-kube-api-access-9rqql\") pod \"console-56cd89967c-2mf8r\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:02.327533 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:02.327357 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:05.454836 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:05.454401 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56cd89967c-2mf8r"] Apr 22 18:46:05.457369 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:46:05.457331 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e9a5ee_55d6_437e_9c9d_511bd4d82837.slice/crio-461fa0f5c4f73ba78f95eccbf003050a2de8b8d7017cc8f773e2597d5cbcf3a8 WatchSource:0}: Error finding container 461fa0f5c4f73ba78f95eccbf003050a2de8b8d7017cc8f773e2597d5cbcf3a8: Status 404 returned error can't find the container with id 461fa0f5c4f73ba78f95eccbf003050a2de8b8d7017cc8f773e2597d5cbcf3a8 Apr 22 18:46:05.985203 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:05.985150 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547d4f7998-lnfs2" event={"ID":"135bb52e-eb9e-4324-b484-4d2a7afbd52b","Type":"ContainerStarted","Data":"d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4"} Apr 22 18:46:05.986853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:05.986819 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cd89967c-2mf8r" event={"ID":"42e9a5ee-55d6-437e-9c9d-511bd4d82837","Type":"ContainerStarted","Data":"6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f"} Apr 22 18:46:05.986853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:05.986856 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cd89967c-2mf8r" event={"ID":"42e9a5ee-55d6-437e-9c9d-511bd4d82837","Type":"ContainerStarted","Data":"461fa0f5c4f73ba78f95eccbf003050a2de8b8d7017cc8f773e2597d5cbcf3a8"} Apr 22 18:46:05.988261 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:05.988232 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zchbz" event={"ID":"d47db21d-84ef-412a-8e2e-dd9b3fc6599b","Type":"ContainerStarted","Data":"9f22faf2eb4e7cfc9f0adaf6a23a47f9fac1ea167d62e58f2998020732267121"} Apr 22 18:46:05.988446 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:05.988410 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-zchbz" Apr 22 18:46:06.005418 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:06.005361 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-547d4f7998-lnfs2" podStartSLOduration=1.629914809 podStartE2EDuration="11.005343762s" podCreationTimestamp="2026-04-22 18:45:55 +0000 UTC" firstStartedPulling="2026-04-22 18:45:55.956407412 +0000 UTC m=+219.243843572" lastFinishedPulling="2026-04-22 18:46:05.331836354 +0000 UTC m=+228.619272525" observedRunningTime="2026-04-22 18:46:06.005009037 +0000 UTC m=+229.292445227" watchObservedRunningTime="2026-04-22 18:46:06.005343762 +0000 UTC m=+229.292779941" Apr 22 18:46:06.006945 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:06.006923 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-zchbz" Apr 22 18:46:06.023480 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:06.023423 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-zchbz" podStartSLOduration=2.020658286 podStartE2EDuration="20.023402968s" podCreationTimestamp="2026-04-22 18:45:46 +0000 
UTC" firstStartedPulling="2026-04-22 18:45:47.369851467 +0000 UTC m=+210.657287622" lastFinishedPulling="2026-04-22 18:46:05.37259614 +0000 UTC m=+228.660032304" observedRunningTime="2026-04-22 18:46:06.022130449 +0000 UTC m=+229.309566627" watchObservedRunningTime="2026-04-22 18:46:06.023402968 +0000 UTC m=+229.310839146" Apr 22 18:46:06.041738 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:06.041682 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56cd89967c-2mf8r" podStartSLOduration=5.041665485 podStartE2EDuration="5.041665485s" podCreationTimestamp="2026-04-22 18:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:06.041286036 +0000 UTC m=+229.328722214" watchObservedRunningTime="2026-04-22 18:46:06.041665485 +0000 UTC m=+229.329101663" Apr 22 18:46:12.328135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:12.328095 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:12.328684 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:12.328256 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:12.333898 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:12.333871 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:13.018947 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:13.018910 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:46:13.084163 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:13.084131 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-547d4f7998-lnfs2"] Apr 22 18:46:13.738911 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:13.738882 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:46:13.743225 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:13.743201 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-95b554d55-6w22l" Apr 22 18:46:15.811101 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:15.811068 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:46:27.055726 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:27.055693 2568 generic.go:358] "Generic (PLEG): container finished" podID="baa71d25-c811-4836-a063-6c81be046499" containerID="6960bf106c479b28cc5cca5cd5aedbc6eba8d7f83f5365192555e93260b3f4ae" exitCode=0 Apr 22 18:46:27.056132 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:27.055772 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" event={"ID":"baa71d25-c811-4836-a063-6c81be046499","Type":"ContainerDied","Data":"6960bf106c479b28cc5cca5cd5aedbc6eba8d7f83f5365192555e93260b3f4ae"} Apr 22 18:46:27.056132 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:27.056086 2568 scope.go:117] "RemoveContainer" containerID="6960bf106c479b28cc5cca5cd5aedbc6eba8d7f83f5365192555e93260b3f4ae" Apr 22 18:46:28.059919 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:28.059885 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-rvjt6" event={"ID":"baa71d25-c811-4836-a063-6c81be046499","Type":"ContainerStarted","Data":"cbc1ace67e852406ad9c9f69743f2fada23b35ed1bcc9146d1e7cddd5ea27648"} Apr 22 18:46:29.048318 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:29.048282 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:46:29.050547 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:29.050527 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b4b135-c4b0-4460-84ed-684f25a4436d-metrics-certs\") pod \"network-metrics-daemon-p7h9z\" (UID: \"42b4b135-c4b0-4460-84ed-684f25a4436d\") " pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:46:29.207791 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:29.207761 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7gnxr\"" Apr 22 18:46:29.215126 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:29.215101 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p7h9z" Apr 22 18:46:29.334931 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:29.334898 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p7h9z"] Apr 22 18:46:29.338637 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:46:29.338596 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b4b135_c4b0_4460_84ed_684f25a4436d.slice/crio-c4da09957d9842b59f8b8312e2059ac6defbf23420f2b987c62fa94e67046714 WatchSource:0}: Error finding container c4da09957d9842b59f8b8312e2059ac6defbf23420f2b987c62fa94e67046714: Status 404 returned error can't find the container with id c4da09957d9842b59f8b8312e2059ac6defbf23420f2b987c62fa94e67046714 Apr 22 18:46:30.066305 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:30.066260 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p7h9z" event={"ID":"42b4b135-c4b0-4460-84ed-684f25a4436d","Type":"ContainerStarted","Data":"c4da09957d9842b59f8b8312e2059ac6defbf23420f2b987c62fa94e67046714"} Apr 22 18:46:31.070523 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:31.070457 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p7h9z" event={"ID":"42b4b135-c4b0-4460-84ed-684f25a4436d","Type":"ContainerStarted","Data":"4765b25e223b4a223e7b4707dca8be79ac7c20015b3ccb91ce3ac634c98b2589"} Apr 22 18:46:31.070523 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:31.070526 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p7h9z" event={"ID":"42b4b135-c4b0-4460-84ed-684f25a4436d","Type":"ContainerStarted","Data":"0b1f51d859f6457cbea49dbf1dcb6b4770ae33e38296fb0433fa31f242926bdd"} Apr 22 18:46:31.088486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:31.088359 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-p7h9z" podStartSLOduration=253.1757588 podStartE2EDuration="4m14.088342718s" podCreationTimestamp="2026-04-22 
18:42:17 +0000 UTC" firstStartedPulling="2026-04-22 18:46:29.340398918 +0000 UTC m=+252.627835082" lastFinishedPulling="2026-04-22 18:46:30.252982839 +0000 UTC m=+253.540419000" observedRunningTime="2026-04-22 18:46:31.087316564 +0000 UTC m=+254.374752766" watchObservedRunningTime="2026-04-22 18:46:31.088342718 +0000 UTC m=+254.375778896" Apr 22 18:46:38.107537 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.107413 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-547d4f7998-lnfs2" podUID="135bb52e-eb9e-4324-b484-4d2a7afbd52b" containerName="console" containerID="cri-o://d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4" gracePeriod=15 Apr 22 18:46:38.369355 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.369333 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-547d4f7998-lnfs2_135bb52e-eb9e-4324-b484-4d2a7afbd52b/console/0.log" Apr 22 18:46:38.369466 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.369392 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:46:38.527812 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.527778 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-oauth-serving-cert\") pod \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " Apr 22 18:46:38.527962 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.527834 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cr7pk\" (UniqueName: \"kubernetes.io/projected/135bb52e-eb9e-4324-b484-4d2a7afbd52b-kube-api-access-cr7pk\") pod \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " Apr 22 18:46:38.527962 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.527854 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-config\") pod \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " Apr 22 18:46:38.527962 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.527887 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-service-ca\") pod \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " Apr 22 18:46:38.527962 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.527911 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-serving-cert\") pod \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " Apr 22 18:46:38.527962 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.527945 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-oauth-config\") pod \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\" (UID: \"135bb52e-eb9e-4324-b484-4d2a7afbd52b\") " Apr 22 18:46:38.528312 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.528275 2568 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "135bb52e-eb9e-4324-b484-4d2a7afbd52b" (UID: "135bb52e-eb9e-4324-b484-4d2a7afbd52b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:38.528435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.528307 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-service-ca" (OuterVolumeSpecName: "service-ca") pod "135bb52e-eb9e-4324-b484-4d2a7afbd52b" (UID: "135bb52e-eb9e-4324-b484-4d2a7afbd52b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:38.528435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.528365 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-config" (OuterVolumeSpecName: "console-config") pod "135bb52e-eb9e-4324-b484-4d2a7afbd52b" (UID: "135bb52e-eb9e-4324-b484-4d2a7afbd52b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:46:38.530194 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.530159 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "135bb52e-eb9e-4324-b484-4d2a7afbd52b" (UID: "135bb52e-eb9e-4324-b484-4d2a7afbd52b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:38.530194 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.530185 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "135bb52e-eb9e-4324-b484-4d2a7afbd52b" (UID: "135bb52e-eb9e-4324-b484-4d2a7afbd52b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:46:38.530323 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.530205 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/135bb52e-eb9e-4324-b484-4d2a7afbd52b-kube-api-access-cr7pk" (OuterVolumeSpecName: "kube-api-access-cr7pk") pod "135bb52e-eb9e-4324-b484-4d2a7afbd52b" (UID: "135bb52e-eb9e-4324-b484-4d2a7afbd52b"). InnerVolumeSpecName "kube-api-access-cr7pk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:46:38.629136 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.629054 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-oauth-config\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:46:38.629136 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.629085 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-oauth-serving-cert\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:46:38.629136 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.629095 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cr7pk\" (UniqueName: \"kubernetes.io/projected/135bb52e-eb9e-4324-b484-4d2a7afbd52b-kube-api-access-cr7pk\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:46:38.629136 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.629103 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-config\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:46:38.629136 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.629113 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/135bb52e-eb9e-4324-b484-4d2a7afbd52b-service-ca\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:46:38.629136 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:38.629121 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/135bb52e-eb9e-4324-b484-4d2a7afbd52b-console-serving-cert\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:46:39.093454 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.093426 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-547d4f7998-lnfs2_135bb52e-eb9e-4324-b484-4d2a7afbd52b/console/0.log" Apr 22 18:46:39.093639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.093463 2568 generic.go:358] "Generic (PLEG): container finished" podID="135bb52e-eb9e-4324-b484-4d2a7afbd52b" containerID="d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4" exitCode=2 Apr 22 18:46:39.093639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.093531 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547d4f7998-lnfs2" event={"ID":"135bb52e-eb9e-4324-b484-4d2a7afbd52b","Type":"ContainerDied","Data":"d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4"} Apr 22 18:46:39.093639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.093567 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-547d4f7998-lnfs2" event={"ID":"135bb52e-eb9e-4324-b484-4d2a7afbd52b","Type":"ContainerDied","Data":"bd3faa5a25c63e48b3292cba19ed47c03f13a33b223a0a88f804c8b5cf8e7bf1"} Apr 22 18:46:39.093639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.093566 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-547d4f7998-lnfs2" Apr 22 18:46:39.093639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.093583 2568 scope.go:117] "RemoveContainer" containerID="d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4" Apr 22 18:46:39.101614 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.101597 2568 scope.go:117] "RemoveContainer" containerID="d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4" Apr 22 18:46:39.101835 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:46:39.101812 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4\": container with ID starting with d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4 not found: ID does not exist" containerID="d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4" Apr 22 18:46:39.101930 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.101842 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4"} err="failed to get container status \"d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4\": rpc error: code = NotFound desc = could not find container \"d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4\": container with ID starting with d2d262658fcf52682c0bd4fe95dbfdadc246c4e4077eab998b50b357121bebe4 not found: ID does not exist" Apr 22 18:46:39.116588 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.116559 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-547d4f7998-lnfs2"] Apr 22 18:46:39.121345 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.121322 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-547d4f7998-lnfs2"] Apr 22 18:46:39.307001 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:39.306967 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135bb52e-eb9e-4324-b484-4d2a7afbd52b" path="/var/lib/kubelet/pods/135bb52e-eb9e-4324-b484-4d2a7afbd52b/volumes" Apr 22 18:46:47.148635 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.148602 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-799b5fb778-7p49l"] Apr 22 18:46:47.149121 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.148908 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="135bb52e-eb9e-4324-b484-4d2a7afbd52b" containerName="console" Apr 22 18:46:47.149121 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.148920 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="135bb52e-eb9e-4324-b484-4d2a7afbd52b" containerName="console" Apr 22 18:46:47.149121 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.148980 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="135bb52e-eb9e-4324-b484-4d2a7afbd52b" containerName="console" Apr 22 18:46:47.183424 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.183398 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799b5fb778-7p49l"] Apr 22 18:46:47.183586 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.183526 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.197888 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.197865 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xq4\" (UniqueName: \"kubernetes.io/projected/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-kube-api-access-k9xq4\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.197989 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.197917 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-serving-cert\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.198027 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.197987 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-oauth-serving-cert\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.198027 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.198016 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-trusted-ca-bundle\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.198101 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.198047 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-oauth-config\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.198101 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.198088 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-config\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.198170 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.198120 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-service-ca\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.298561 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.298529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-oauth-serving-cert\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.298561 
ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.298564 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-trusted-ca-bundle\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.298826 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.298611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-oauth-config\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.298826 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.298643 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-config\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.298826 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.298714 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-service-ca\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.298826 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.298771 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xq4\" (UniqueName: \"kubernetes.io/projected/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-kube-api-access-k9xq4\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.299062 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.298830 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-serving-cert\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.299485 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.299336 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-oauth-serving-cert\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.299485 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.299430 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-config\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.299485 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.299472 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-service-ca\") pod 
\"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.299830 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.299810 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-trusted-ca-bundle\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.301205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.301189 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-serving-cert\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.301259 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.301224 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-oauth-config\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.313831 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.313802 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xq4\" (UniqueName: \"kubernetes.io/projected/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-kube-api-access-k9xq4\") pod \"console-799b5fb778-7p49l\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.492909 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.492825 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:47.612243 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:47.612219 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-799b5fb778-7p49l"] Apr 22 18:46:47.614444 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:46:47.614417 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f87480e_3ca4_4274_a2fc_92b5cb9daa00.slice/crio-5187dd93af806344309854785a70e4b25c38881437f2da837f88c79e829413ba WatchSource:0}: Error finding container 5187dd93af806344309854785a70e4b25c38881437f2da837f88c79e829413ba: Status 404 returned error can't find the container with id 5187dd93af806344309854785a70e4b25c38881437f2da837f88c79e829413ba Apr 22 18:46:48.121732 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:48.121693 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799b5fb778-7p49l" event={"ID":"2f87480e-3ca4-4274-a2fc-92b5cb9daa00","Type":"ContainerStarted","Data":"90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e"} Apr 22 18:46:48.121732 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:48.121731 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799b5fb778-7p49l" event={"ID":"2f87480e-3ca4-4274-a2fc-92b5cb9daa00","Type":"ContainerStarted","Data":"5187dd93af806344309854785a70e4b25c38881437f2da837f88c79e829413ba"} Apr 22 18:46:48.150697 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:48.150647 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-799b5fb778-7p49l" podStartSLOduration=1.150630869 podStartE2EDuration="1.150630869s" podCreationTimestamp="2026-04-22 18:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:48.149636273 +0000 UTC m=+271.437072451" watchObservedRunningTime="2026-04-22 18:46:48.150630869 +0000 UTC m=+271.438067046" Apr 22 18:46:50.997767 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:50.997737 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:46:51.021760 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.021729 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.024566 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.024539 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:46:51.027184 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.027160 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:46:51.027358 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.027338 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:46:51.027420 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.027224 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:46:51.027420 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.027350 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:46:51.027420 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.027381 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:46:51.027893 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.027872 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:46:51.029680 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.029326 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-t4m9z\"" Apr 22 18:46:51.029680 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.029361 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:46:51.031633 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.031611 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:46:51.040092 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.040061 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:46:51.132413 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132382 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132422 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6px5\" (UniqueName: \"kubernetes.io/projected/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-kube-api-access-z6px5\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132459 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-config-volume\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132496 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-config-out\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132538 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132560 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-web-config\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132584 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132604 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132915 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132651 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132915 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132668 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132915 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132683 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.132915 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.132764 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234140 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234107 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234339 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234155 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234339 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234221 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234339 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234254 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234339 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234285 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6px5\" (UniqueName: \"kubernetes.io/projected/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-kube-api-access-z6px5\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234591 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234340 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-config-volume\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234591 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234364 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234591 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234413 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-config-out\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234591 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234440 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234591 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234470 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-web-config\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234591 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234591 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234570 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.234906 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.234604 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.235086 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.235058 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.235430 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.235407 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" 
(UniqueName: \"kubernetes.io/empty-dir/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.236206 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.236181 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.237640 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.237615 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.237909 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.237865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-config-out\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.238016 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.237943 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.238171 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.238149 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.238381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.238359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-web-config\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.238450 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.238395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.238612 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.238584 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-config-volume\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.239160 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.239131 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.239643 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.239623 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.243652 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.243629 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6px5\" (UniqueName: \"kubernetes.io/projected/c4b20925-76d9-43ec-9e89-1ad0ad8b85ad-kube-api-access-z6px5\") pod \"alertmanager-main-0\" (UID: \"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.333903 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.333865 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:46:51.470100 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:51.470071 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:46:51.471534 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:46:51.471509 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4b20925_76d9_43ec_9e89_1ad0ad8b85ad.slice/crio-547e98713e580ba8e3db80d5e24d7716ae6774d3fd8098bacc6b3d3add325ea3 WatchSource:0}: Error finding container 547e98713e580ba8e3db80d5e24d7716ae6774d3fd8098bacc6b3d3add325ea3: Status 404 returned error can't find the container with id 547e98713e580ba8e3db80d5e24d7716ae6774d3fd8098bacc6b3d3add325ea3 Apr 22 18:46:52.134972 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:52.134940 2568 generic.go:358] "Generic (PLEG): container finished" podID="c4b20925-76d9-43ec-9e89-1ad0ad8b85ad" containerID="811e48ee4db4f36d66d56f3c5b77eb3904a8476e98dbb576833c9a3b964b840d" exitCode=0 Apr 22 18:46:52.135357 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:52.135029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad","Type":"ContainerDied","Data":"811e48ee4db4f36d66d56f3c5b77eb3904a8476e98dbb576833c9a3b964b840d"} Apr 22 18:46:52.135357 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:52.135067 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad","Type":"ContainerStarted","Data":"547e98713e580ba8e3db80d5e24d7716ae6774d3fd8098bacc6b3d3add325ea3"} Apr 22 18:46:54.148382 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:54.148344 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad","Type":"ContainerStarted","Data":"660311e91cd5cf96205667688ddb98464007ae70c02b8c6126d4e2001a2c3e20"} Apr 22 18:46:54.148382 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:54.148381 2568 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad","Type":"ContainerStarted","Data":"901481fc41138814cbc9afb83e912e3aba2406b38cf672f3808ff311d221095b"} Apr 22 18:46:54.148382 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:54.148391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad","Type":"ContainerStarted","Data":"3348ba9272ca601525f414100cc64981f544d02c97863dc4f168b459ce9a51df"} Apr 22 18:46:54.149019 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:54.148401 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad","Type":"ContainerStarted","Data":"7bf40ab2495fa476d868e5c06f4a1aae9623d3b6bbbaaf589ea4da86080a1d01"} Apr 22 18:46:54.149019 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:54.148414 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad","Type":"ContainerStarted","Data":"65efb1b861976c90cabf280f6f9fb1cd810ab28ff155bdc5cc932e7440d8a100"} Apr 22 18:46:54.149019 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:54.148423 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c4b20925-76d9-43ec-9e89-1ad0ad8b85ad","Type":"ContainerStarted","Data":"84eb75a4588385e320e8c9b94bef452da17cc49bfc7e42aebb45ccf7c2455640"} Apr 22 18:46:54.183459 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:54.183405 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.825354529 podStartE2EDuration="4.183391959s" podCreationTimestamp="2026-04-22 18:46:50 +0000 UTC" firstStartedPulling="2026-04-22 18:46:52.136169731 +0000 UTC m=+275.423605886" lastFinishedPulling="2026-04-22 18:46:53.494207161 +0000 UTC m=+276.781643316" observedRunningTime="2026-04-22 18:46:54.181251902 +0000 UTC m=+277.468688091" watchObservedRunningTime="2026-04-22 18:46:54.183391959 +0000 UTC m=+277.470828136" Apr 22 18:46:55.728495 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:46:55.728436 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-j7zw4" podUID="d0da4783-9f5c-40c3-80f0-155df59f22de" Apr 22 18:46:56.153547 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:56.153483 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-j7zw4" Apr 22 18:46:57.493163 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:57.493112 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:57.493592 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:57.493210 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:57.497825 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:57.497806 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:58.162810 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:58.162779 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:46:58.229174 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:58.229146 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56cd89967c-2mf8r"] Apr 22 18:46:59.713947 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.713909 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:46:59.713947 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.713946 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:46:59.716245 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.716213 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d0da4783-9f5c-40c3-80f0-155df59f22de-metrics-tls\") pod \"dns-default-j7zw4\" (UID: \"d0da4783-9f5c-40c3-80f0-155df59f22de\") " pod="openshift-dns/dns-default-j7zw4" Apr 22 18:46:59.716388 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.716367 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b09bc722-d952-4713-bbad-524034fa2063-cert\") pod \"ingress-canary-9c2rk\" (UID: \"b09bc722-d952-4713-bbad-524034fa2063\") " pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:46:59.757244 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.757216 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5kx7g\"" Apr 22 18:46:59.765301 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.765279 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-j7zw4" Apr 22 18:46:59.806925 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.806897 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrbrj\"" Apr 22 18:46:59.814265 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.814236 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9c2rk" Apr 22 18:46:59.893203 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.893171 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-j7zw4"] Apr 22 18:46:59.897512 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:46:59.897475 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0da4783_9f5c_40c3_80f0_155df59f22de.slice/crio-7e13aa7b5667725016cd91032469d67171ec8c5168587e1c909be7a0d95d3a6b WatchSource:0}: Error finding container 7e13aa7b5667725016cd91032469d67171ec8c5168587e1c909be7a0d95d3a6b: Status 404 returned error can't find the container with id 7e13aa7b5667725016cd91032469d67171ec8c5168587e1c909be7a0d95d3a6b Apr 22 18:46:59.943682 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:46:59.943657 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9c2rk"] Apr 22 18:46:59.946091 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:46:59.946064 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb09bc722_d952_4713_bbad_524034fa2063.slice/crio-d623067c3fa78a46cfda8d445318a2d40d3130e3d8bfcec4fd251c8797c1a844 WatchSource:0}: Error finding container d623067c3fa78a46cfda8d445318a2d40d3130e3d8bfcec4fd251c8797c1a844: Status 404 returned error can't find the container with id d623067c3fa78a46cfda8d445318a2d40d3130e3d8bfcec4fd251c8797c1a844 Apr 22 18:47:00.165904 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:00.165868 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7zw4" event={"ID":"d0da4783-9f5c-40c3-80f0-155df59f22de","Type":"ContainerStarted","Data":"7e13aa7b5667725016cd91032469d67171ec8c5168587e1c909be7a0d95d3a6b"} Apr 22 18:47:00.166887 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:00.166859 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9c2rk" event={"ID":"b09bc722-d952-4713-bbad-524034fa2063","Type":"ContainerStarted","Data":"d623067c3fa78a46cfda8d445318a2d40d3130e3d8bfcec4fd251c8797c1a844"} Apr 22 18:47:02.175637 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:02.175601 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9c2rk" event={"ID":"b09bc722-d952-4713-bbad-524034fa2063","Type":"ContainerStarted","Data":"f230098b81d5b34a53eab985bd55bc261f1e9ba7a9560d3c4fbf71bcf93beb1a"} Apr 22 18:47:02.177305 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:02.177275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7zw4" event={"ID":"d0da4783-9f5c-40c3-80f0-155df59f22de","Type":"ContainerStarted","Data":"3252f21c3f726a0523cab438a02e33443ff0baac5645db64f37d953220a635cc"} Apr 22 18:47:02.177305 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:02.177303 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-j7zw4" event={"ID":"d0da4783-9f5c-40c3-80f0-155df59f22de","Type":"ContainerStarted","Data":"c71753ceca184f908709e016cb240f54b5cdcfcb31b855338d72224378d05027"} Apr 22 18:47:02.177496 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:02.177373 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-j7zw4" Apr 22 18:47:02.197652 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:02.197605 2568 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-ingress-canary/ingress-canary-9c2rk" podStartSLOduration=251.375553004 podStartE2EDuration="4m13.197591212s" podCreationTimestamp="2026-04-22 18:42:49 +0000 UTC" firstStartedPulling="2026-04-22 18:46:59.947958484 +0000 UTC m=+283.235394640" lastFinishedPulling="2026-04-22 18:47:01.769996682 +0000 UTC m=+285.057432848" observedRunningTime="2026-04-22 18:47:02.195973909 +0000 UTC m=+285.483410086" watchObservedRunningTime="2026-04-22 18:47:02.197591212 +0000 UTC m=+285.485027368" Apr 22 18:47:02.217433 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:02.217386 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-j7zw4" podStartSLOduration=251.350057004 podStartE2EDuration="4m13.217371359s" podCreationTimestamp="2026-04-22 18:42:49 +0000 UTC" firstStartedPulling="2026-04-22 18:46:59.899825499 +0000 UTC m=+283.187261656" lastFinishedPulling="2026-04-22 18:47:01.76713984 +0000 UTC m=+285.054576011" observedRunningTime="2026-04-22 18:47:02.216426031 +0000 UTC m=+285.503862208" watchObservedRunningTime="2026-04-22 18:47:02.217371359 +0000 UTC m=+285.504807586" Apr 22 18:47:12.185472 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:12.185439 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-j7zw4" Apr 22 18:47:17.212442 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:17.212419 2568 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:47:23.247854 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.247819 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56cd89967c-2mf8r" podUID="42e9a5ee-55d6-437e-9c9d-511bd4d82837" containerName="console" containerID="cri-o://6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f" gracePeriod=15 Apr 22 18:47:23.478894 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.478873 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56cd89967c-2mf8r_42e9a5ee-55d6-437e-9c9d-511bd4d82837/console/0.log" Apr 22 18:47:23.479006 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.478935 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:47:23.622679 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.622640 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-serving-cert\") pod \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " Apr 22 18:47:23.622852 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.622701 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-trusted-ca-bundle\") pod \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " Apr 22 18:47:23.622852 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.622742 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-oauth-config\") pod \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " Apr 22 18:47:23.622852 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.622779 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rqql\" (UniqueName: \"kubernetes.io/projected/42e9a5ee-55d6-437e-9c9d-511bd4d82837-kube-api-access-9rqql\") pod \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " Apr 22 18:47:23.622852 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.622811 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-service-ca\") pod \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " Apr 22 18:47:23.623108 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.622862 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-oauth-serving-cert\") pod \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " Apr 22 18:47:23.623108 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.622891 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-config\") pod \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\" (UID: \"42e9a5ee-55d6-437e-9c9d-511bd4d82837\") " Apr 22 18:47:23.623202 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.623181 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "42e9a5ee-55d6-437e-9c9d-511bd4d82837" (UID: "42e9a5ee-55d6-437e-9c9d-511bd4d82837"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:23.623475 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.623327 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "42e9a5ee-55d6-437e-9c9d-511bd4d82837" (UID: "42e9a5ee-55d6-437e-9c9d-511bd4d82837"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:23.623475 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.623422 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-service-ca" (OuterVolumeSpecName: "service-ca") pod "42e9a5ee-55d6-437e-9c9d-511bd4d82837" (UID: "42e9a5ee-55d6-437e-9c9d-511bd4d82837"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:23.623626 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.623534 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-config" (OuterVolumeSpecName: "console-config") pod "42e9a5ee-55d6-437e-9c9d-511bd4d82837" (UID: "42e9a5ee-55d6-437e-9c9d-511bd4d82837"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:47:23.624886 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.624864 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "42e9a5ee-55d6-437e-9c9d-511bd4d82837" (UID: "42e9a5ee-55d6-437e-9c9d-511bd4d82837"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:23.624970 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.624908 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "42e9a5ee-55d6-437e-9c9d-511bd4d82837" (UID: "42e9a5ee-55d6-437e-9c9d-511bd4d82837"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:47:23.624970 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.624928 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e9a5ee-55d6-437e-9c9d-511bd4d82837-kube-api-access-9rqql" (OuterVolumeSpecName: "kube-api-access-9rqql") pod "42e9a5ee-55d6-437e-9c9d-511bd4d82837" (UID: "42e9a5ee-55d6-437e-9c9d-511bd4d82837"). InnerVolumeSpecName "kube-api-access-9rqql". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:47:23.723903 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.723869 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-service-ca\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:47:23.723903 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.723899 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-oauth-serving-cert\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:47:23.723903 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.723913 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-config\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:47:23.724123 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.723923 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-serving-cert\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:47:23.724123 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.723935 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42e9a5ee-55d6-437e-9c9d-511bd4d82837-trusted-ca-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:47:23.724123 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.723944 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42e9a5ee-55d6-437e-9c9d-511bd4d82837-console-oauth-config\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:47:23.724123 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:23.723953 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9rqql\" (UniqueName: \"kubernetes.io/projected/42e9a5ee-55d6-437e-9c9d-511bd4d82837-kube-api-access-9rqql\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:47:24.249804 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.249779 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56cd89967c-2mf8r_42e9a5ee-55d6-437e-9c9d-511bd4d82837/console/0.log" Apr 22 18:47:24.250224 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.249820 2568 generic.go:358] "Generic (PLEG): container finished" podID="42e9a5ee-55d6-437e-9c9d-511bd4d82837" containerID="6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f" exitCode=2 Apr 22 18:47:24.250224 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.249849 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cd89967c-2mf8r" event={"ID":"42e9a5ee-55d6-437e-9c9d-511bd4d82837","Type":"ContainerDied","Data":"6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f"} Apr 22 18:47:24.250224 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.249886 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56cd89967c-2mf8r" event={"ID":"42e9a5ee-55d6-437e-9c9d-511bd4d82837","Type":"ContainerDied","Data":"461fa0f5c4f73ba78f95eccbf003050a2de8b8d7017cc8f773e2597d5cbcf3a8"} Apr 22 18:47:24.250224 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.249890 2568 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56cd89967c-2mf8r" Apr 22 18:47:24.250224 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.249902 2568 scope.go:117] "RemoveContainer" containerID="6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f" Apr 22 18:47:24.257935 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.257919 2568 scope.go:117] "RemoveContainer" containerID="6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f" Apr 22 18:47:24.258180 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:47:24.258160 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f\": container with ID starting with 6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f not found: ID does not exist" containerID="6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f" Apr 22 18:47:24.258228 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.258188 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f"} err="failed to get container status \"6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f\": rpc error: code = NotFound desc = could not find container \"6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f\": container with ID starting with 6ae326e8e0ceb7662d8b83cdcbe056b63055bef73f71d5848ddf05c8878a405f not found: ID does not exist" Apr 22 18:47:24.272383 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.272359 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56cd89967c-2mf8r"] Apr 22 18:47:24.277643 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:24.277619 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56cd89967c-2mf8r"] Apr 22 18:47:25.306100 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:47:25.306066 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e9a5ee-55d6-437e-9c9d-511bd4d82837" path="/var/lib/kubelet/pods/42e9a5ee-55d6-437e-9c9d-511bd4d82837/volumes" Apr 22 18:50:49.665035 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.664998 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv"] Apr 22 18:50:49.665552 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.665335 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e9a5ee-55d6-437e-9c9d-511bd4d82837" containerName="console" Apr 22 18:50:49.665552 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.665352 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e9a5ee-55d6-437e-9c9d-511bd4d82837" containerName="console" Apr 22 18:50:49.665552 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.665431 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e9a5ee-55d6-437e-9c9d-511bd4d82837" containerName="console" Apr 22 18:50:49.667413 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.667385 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.671730 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.671710 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:50:49.671952 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.671932 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9j2jm\"" Apr 22 18:50:49.672246 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.672228 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:50:49.682156 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.682134 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv"] Apr 22 18:50:49.728256 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.728230 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.728396 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.728261 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.728396 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.728351 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzdfk\" (UniqueName: \"kubernetes.io/projected/f81093ec-7160-4dec-8c66-f920d2663bff-kube-api-access-mzdfk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.829336 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.829283 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzdfk\" (UniqueName: \"kubernetes.io/projected/f81093ec-7160-4dec-8c66-f920d2663bff-kube-api-access-mzdfk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.829564 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.829358 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.829564 ip-10-0-137-223 kubenswrapper[2568]: I0422 
18:50:49.829379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.829828 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.829807 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-bundle\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.829875 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.829823 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-util\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.839208 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.839186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzdfk\" (UniqueName: \"kubernetes.io/projected/f81093ec-7160-4dec-8c66-f920d2663bff-kube-api-access-mzdfk\") pod \"2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:49.976183 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:49.976101 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:50:50.100122 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:50.100104 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv"] Apr 22 18:50:50.102396 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:50:50.102368 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf81093ec_7160_4dec_8c66_f920d2663bff.slice/crio-2c65eb5fe838146e802dd206e49c76a62fbe298515a21d5c6687e276ced737de WatchSource:0}: Error finding container 2c65eb5fe838146e802dd206e49c76a62fbe298515a21d5c6687e276ced737de: Status 404 returned error can't find the container with id 2c65eb5fe838146e802dd206e49c76a62fbe298515a21d5c6687e276ced737de Apr 22 18:50:50.104260 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:50.104241 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:50:50.842849 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:50.842809 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" event={"ID":"f81093ec-7160-4dec-8c66-f920d2663bff","Type":"ContainerStarted","Data":"2c65eb5fe838146e802dd206e49c76a62fbe298515a21d5c6687e276ced737de"} Apr 22 18:50:55.860604 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:55.860570 2568 generic.go:358] "Generic (PLEG): container finished" podID="f81093ec-7160-4dec-8c66-f920d2663bff" containerID="6f7846e4839d308ad44442010dc152026ac9d024d8387f1e2381f4edb5eeec30" exitCode=0 Apr 22 18:50:55.861022 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:55.860647 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" event={"ID":"f81093ec-7160-4dec-8c66-f920d2663bff","Type":"ContainerDied","Data":"6f7846e4839d308ad44442010dc152026ac9d024d8387f1e2381f4edb5eeec30"} Apr 22 18:50:57.868562 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:57.868528 2568 generic.go:358] "Generic (PLEG): container finished" podID="f81093ec-7160-4dec-8c66-f920d2663bff" containerID="96ed95e46468a234713dd05b83134c4474a7f2f20d5f72019c51ce9ea558cd30" exitCode=0 Apr 22 18:50:57.868926 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:50:57.868606 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" event={"ID":"f81093ec-7160-4dec-8c66-f920d2663bff","Type":"ContainerDied","Data":"96ed95e46468a234713dd05b83134c4474a7f2f20d5f72019c51ce9ea558cd30"} Apr 22 18:51:03.890472 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:03.890443 2568 generic.go:358] "Generic (PLEG): container finished" podID="f81093ec-7160-4dec-8c66-f920d2663bff" containerID="532e20ce5e2df5828b4fd319dffd1048e2370bca4eda3edb5cc34dde4ea311e7" exitCode=0 Apr 22 18:51:03.890794 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:03.890548 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" event={"ID":"f81093ec-7160-4dec-8c66-f920d2663bff","Type":"ContainerDied","Data":"532e20ce5e2df5828b4fd319dffd1048e2370bca4eda3edb5cc34dde4ea311e7"} Apr 22 18:51:05.016192 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.016170 2568 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:51:05.173244 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.173149 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-bundle\") pod \"f81093ec-7160-4dec-8c66-f920d2663bff\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " Apr 22 18:51:05.173413 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.173258 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-util\") pod \"f81093ec-7160-4dec-8c66-f920d2663bff\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " Apr 22 18:51:05.173413 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.173371 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzdfk\" (UniqueName: \"kubernetes.io/projected/f81093ec-7160-4dec-8c66-f920d2663bff-kube-api-access-mzdfk\") pod \"f81093ec-7160-4dec-8c66-f920d2663bff\" (UID: \"f81093ec-7160-4dec-8c66-f920d2663bff\") " Apr 22 18:51:05.173827 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.173805 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-bundle" (OuterVolumeSpecName: "bundle") pod "f81093ec-7160-4dec-8c66-f920d2663bff" (UID: "f81093ec-7160-4dec-8c66-f920d2663bff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:05.175477 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.175453 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f81093ec-7160-4dec-8c66-f920d2663bff-kube-api-access-mzdfk" (OuterVolumeSpecName: "kube-api-access-mzdfk") pod "f81093ec-7160-4dec-8c66-f920d2663bff" (UID: "f81093ec-7160-4dec-8c66-f920d2663bff"). InnerVolumeSpecName "kube-api-access-mzdfk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:05.178134 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.178111 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-util" (OuterVolumeSpecName: "util") pod "f81093ec-7160-4dec-8c66-f920d2663bff" (UID: "f81093ec-7160-4dec-8c66-f920d2663bff"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:05.274594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.274555 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:05.274594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.274587 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mzdfk\" (UniqueName: \"kubernetes.io/projected/f81093ec-7160-4dec-8c66-f920d2663bff-kube-api-access-mzdfk\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:05.274594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.274598 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f81093ec-7160-4dec-8c66-f920d2663bff-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:05.896854 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.896825 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" Apr 22 18:51:05.896854 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.896841 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2bb52b92bd31ddf2ebbc335370ac517be50e61a93c2fe375393413c19d2msxv" event={"ID":"f81093ec-7160-4dec-8c66-f920d2663bff","Type":"ContainerDied","Data":"2c65eb5fe838146e802dd206e49c76a62fbe298515a21d5c6687e276ced737de"} Apr 22 18:51:05.897046 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:05.896870 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c65eb5fe838146e802dd206e49c76a62fbe298515a21d5c6687e276ced737de" Apr 22 18:51:12.863779 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.863701 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q"] Apr 22 18:51:12.864199 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.864149 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f81093ec-7160-4dec-8c66-f920d2663bff" containerName="extract" Apr 22 18:51:12.864199 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.864165 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81093ec-7160-4dec-8c66-f920d2663bff" containerName="extract" Apr 22 18:51:12.864199 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.864178 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f81093ec-7160-4dec-8c66-f920d2663bff" containerName="pull" Apr 22 18:51:12.864199 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.864186 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81093ec-7160-4dec-8c66-f920d2663bff" containerName="pull" Apr 22 18:51:12.864199 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.864198 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f81093ec-7160-4dec-8c66-f920d2663bff" containerName="util" Apr 22 18:51:12.864444 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.864208 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f81093ec-7160-4dec-8c66-f920d2663bff" containerName="util" Apr 22 18:51:12.864444 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.864311 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f81093ec-7160-4dec-8c66-f920d2663bff" 
containerName="extract" Apr 22 18:51:12.921385 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.921354 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q"] Apr 22 18:51:12.921601 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.921489 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" Apr 22 18:51:12.924211 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.924181 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 22 18:51:12.924211 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.924190 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:51:12.924409 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.924243 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-w7jh7\"" Apr 22 18:51:12.936233 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.936212 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2rnb\" (UniqueName: \"kubernetes.io/projected/699d4f7f-a14c-4149-b189-c577d3e91063-kube-api-access-t2rnb\") pod \"cert-manager-operator-controller-manager-54b9655956-gqj4q\" (UID: \"699d4f7f-a14c-4149-b189-c577d3e91063\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" Apr 22 18:51:12.936344 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:12.936250 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/699d4f7f-a14c-4149-b189-c577d3e91063-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-gqj4q\" (UID: \"699d4f7f-a14c-4149-b189-c577d3e91063\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" Apr 22 18:51:13.036739 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:13.036693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2rnb\" (UniqueName: \"kubernetes.io/projected/699d4f7f-a14c-4149-b189-c577d3e91063-kube-api-access-t2rnb\") pod \"cert-manager-operator-controller-manager-54b9655956-gqj4q\" (UID: \"699d4f7f-a14c-4149-b189-c577d3e91063\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" Apr 22 18:51:13.036907 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:13.036751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/699d4f7f-a14c-4149-b189-c577d3e91063-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-gqj4q\" (UID: \"699d4f7f-a14c-4149-b189-c577d3e91063\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" Apr 22 18:51:13.037089 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:13.037073 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/699d4f7f-a14c-4149-b189-c577d3e91063-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-gqj4q\" (UID: \"699d4f7f-a14c-4149-b189-c577d3e91063\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" 
Apr 22 18:51:13.048279 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:13.048253 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2rnb\" (UniqueName: \"kubernetes.io/projected/699d4f7f-a14c-4149-b189-c577d3e91063-kube-api-access-t2rnb\") pod \"cert-manager-operator-controller-manager-54b9655956-gqj4q\" (UID: \"699d4f7f-a14c-4149-b189-c577d3e91063\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q"
Apr 22 18:51:13.230679 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:13.230593 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q"
Apr 22 18:51:13.365845 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:13.365812 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q"]
Apr 22 18:51:13.369540 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:51:13.369490 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699d4f7f_a14c_4149_b189_c577d3e91063.slice/crio-ec97cf8cea66810c9aa152819e51241ad565c31a97746940096d554277aaa0b5 WatchSource:0}: Error finding container ec97cf8cea66810c9aa152819e51241ad565c31a97746940096d554277aaa0b5: Status 404 returned error can't find the container with id ec97cf8cea66810c9aa152819e51241ad565c31a97746940096d554277aaa0b5
Apr 22 18:51:13.922359 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:13.922308 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" event={"ID":"699d4f7f-a14c-4149-b189-c577d3e91063","Type":"ContainerStarted","Data":"ec97cf8cea66810c9aa152819e51241ad565c31a97746940096d554277aaa0b5"}
Apr 22 18:51:15.335842 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.335810 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6c749d644c-mnmb8"]
Apr 22 18:51:15.356044 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.356012 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c749d644c-mnmb8"]
Apr 22 18:51:15.356184 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.356144 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.453969 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.453941 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-trusted-ca-bundle\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.454114 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.453975 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-service-ca\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.454114 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.454050 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-oauth-serving-cert\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.454114 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.454084 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-serving-cert\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.454114 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.454107 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-config\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.454242 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.454133 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-oauth-config\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.454242 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.454180 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h68v2\" (UniqueName: \"kubernetes.io/projected/657cc86f-ffcb-47c5-b5ad-39f5262418c9-kube-api-access-h68v2\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.555299 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.555268 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-oauth-config\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.555477 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.555310 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h68v2\" (UniqueName: \"kubernetes.io/projected/657cc86f-ffcb-47c5-b5ad-39f5262418c9-kube-api-access-h68v2\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.555477 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.555339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-trusted-ca-bundle\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.555477 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.555461 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-service-ca\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.555662 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.555590 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-oauth-serving-cert\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.555662 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.555622 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-serving-cert\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.555757 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.555661 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-config\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.556161 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.556117 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-service-ca\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.556267 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.556219 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-config\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.556312 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.556262 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-oauth-serving-cert\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.556312 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.556301 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/657cc86f-ffcb-47c5-b5ad-39f5262418c9-trusted-ca-bundle\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.557781 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.557764 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-oauth-config\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.558080 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.558063 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/657cc86f-ffcb-47c5-b5ad-39f5262418c9-console-serving-cert\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.569421 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.569394 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h68v2\" (UniqueName: \"kubernetes.io/projected/657cc86f-ffcb-47c5-b5ad-39f5262418c9-kube-api-access-h68v2\") pod \"console-6c749d644c-mnmb8\" (UID: \"657cc86f-ffcb-47c5-b5ad-39f5262418c9\") " pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.666245 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.666166 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6c749d644c-mnmb8"
Apr 22 18:51:15.886548 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.886524 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6c749d644c-mnmb8"]
Apr 22 18:51:15.888556 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:51:15.888531 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod657cc86f_ffcb_47c5_b5ad_39f5262418c9.slice/crio-39addd7cb10476ed90e367f34ea139f88c94dc9eaabfe0a2abbee211a17e2d81 WatchSource:0}: Error finding container 39addd7cb10476ed90e367f34ea139f88c94dc9eaabfe0a2abbee211a17e2d81: Status 404 returned error can't find the container with id 39addd7cb10476ed90e367f34ea139f88c94dc9eaabfe0a2abbee211a17e2d81
Apr 22 18:51:15.931905 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.931873 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" event={"ID":"699d4f7f-a14c-4149-b189-c577d3e91063","Type":"ContainerStarted","Data":"edd723a8379c6f2dfa95d177e9071e5fc3e5cda8fae377bf1367f019591244b1"}
Apr 22 18:51:15.933077 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:15.933050 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c749d644c-mnmb8" event={"ID":"657cc86f-ffcb-47c5-b5ad-39f5262418c9","Type":"ContainerStarted","Data":"39addd7cb10476ed90e367f34ea139f88c94dc9eaabfe0a2abbee211a17e2d81"}
Apr 22 18:51:16.937663 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:16.937629 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6c749d644c-mnmb8" event={"ID":"657cc86f-ffcb-47c5-b5ad-39f5262418c9","Type":"ContainerStarted","Data":"4efefecbb6bf4236aba1dd1f817cf4401fe5fd206efaa7a4cf17aaf1f67e502f"}
Apr 22 18:51:16.961596 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:16.961534 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-gqj4q" podStartSLOduration=2.567220271 podStartE2EDuration="4.961485285s" podCreationTimestamp="2026-04-22 18:51:12 +0000 UTC" firstStartedPulling="2026-04-22 18:51:13.372084973 +0000 UTC m=+536.659521133" lastFinishedPulling="2026-04-22 18:51:15.766349979 +0000 UTC m=+539.053786147" observedRunningTime="2026-04-22 18:51:16.961404393 +0000 UTC m=+540.248840573" watchObservedRunningTime="2026-04-22 18:51:16.961485285 +0000 UTC m=+540.248921469"
Apr 22 18:51:16.980541 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:16.980480 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6c749d644c-mnmb8" podStartSLOduration=1.980466644 podStartE2EDuration="1.980466644s" podCreationTimestamp="2026-04-22 18:51:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:16.978470579 +0000 UTC m=+540.265906758" watchObservedRunningTime="2026-04-22 18:51:16.980466644 +0000 UTC m=+540.267902822"
Apr 22 18:51:17.794768 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.794729 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf"]
Apr 22 18:51:17.818306 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.818279 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf"]
pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf"] Apr 22 18:51:17.818451 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.818401 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.821468 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.821369 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:51:17.821468 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.821403 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9j2jm\"" Apr 22 18:51:17.821847 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.821823 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:51:17.872047 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.872014 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.872202 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.872095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.872202 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.872125 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn8jw\" (UniqueName: \"kubernetes.io/projected/2dcc86a4-83cf-4abd-a360-6246108d78d2-kube-api-access-qn8jw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.972720 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.972682 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.973126 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.972730 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qn8jw\" (UniqueName: \"kubernetes.io/projected/2dcc86a4-83cf-4abd-a360-6246108d78d2-kube-api-access-qn8jw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.973126 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.972793 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.973126 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.973073 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.973354 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.973129 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:17.985294 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:17.985266 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn8jw\" (UniqueName: \"kubernetes.io/projected/2dcc86a4-83cf-4abd-a360-6246108d78d2-kube-api-access-qn8jw\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:18.051400 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.051370 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-gswtb"] Apr 22 18:51:18.071569 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.071544 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-gswtb"] Apr 22 18:51:18.071711 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.071570 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:18.073485 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.073463 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mfcz\" (UniqueName: \"kubernetes.io/projected/1847e710-1c28-449a-a860-79aa86be716c-kube-api-access-9mfcz\") pod \"cert-manager-webhook-587ccfb98-gswtb\" (UID: \"1847e710-1c28-449a-a860-79aa86be716c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:18.073597 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.073492 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1847e710-1c28-449a-a860-79aa86be716c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-gswtb\" (UID: \"1847e710-1c28-449a-a860-79aa86be716c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:18.074180 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.074162 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-thp8h\"" Apr 22 18:51:18.074408 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.074388 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 18:51:18.074669 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.074652 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 18:51:18.128422 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.128386 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:18.174815 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.174776 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mfcz\" (UniqueName: \"kubernetes.io/projected/1847e710-1c28-449a-a860-79aa86be716c-kube-api-access-9mfcz\") pod \"cert-manager-webhook-587ccfb98-gswtb\" (UID: \"1847e710-1c28-449a-a860-79aa86be716c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:18.174976 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.174823 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1847e710-1c28-449a-a860-79aa86be716c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-gswtb\" (UID: \"1847e710-1c28-449a-a860-79aa86be716c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:18.184991 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.184962 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1847e710-1c28-449a-a860-79aa86be716c-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-gswtb\" (UID: \"1847e710-1c28-449a-a860-79aa86be716c\") " pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:18.185455 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.185437 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mfcz\" (UniqueName: \"kubernetes.io/projected/1847e710-1c28-449a-a860-79aa86be716c-kube-api-access-9mfcz\") pod \"cert-manager-webhook-587ccfb98-gswtb\" (UID: \"1847e710-1c28-449a-a860-79aa86be716c\") " 
pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:18.251087 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.251062 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf"] Apr 22 18:51:18.252653 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:51:18.252630 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dcc86a4_83cf_4abd_a360_6246108d78d2.slice/crio-834de7b0df43421182d829ee5d2b091f0c3c686f82b8363e93fb19641f5de962 WatchSource:0}: Error finding container 834de7b0df43421182d829ee5d2b091f0c3c686f82b8363e93fb19641f5de962: Status 404 returned error can't find the container with id 834de7b0df43421182d829ee5d2b091f0c3c686f82b8363e93fb19641f5de962 Apr 22 18:51:18.390691 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.390657 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:18.521024 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.521001 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-gswtb"] Apr 22 18:51:18.523061 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:51:18.523033 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1847e710_1c28_449a_a860_79aa86be716c.slice/crio-7ec1f38e442dfcd63d7c45a6d14000bac818b27479f31a7d4f5df3908c9fb656 WatchSource:0}: Error finding container 7ec1f38e442dfcd63d7c45a6d14000bac818b27479f31a7d4f5df3908c9fb656: Status 404 returned error can't find the container with id 7ec1f38e442dfcd63d7c45a6d14000bac818b27479f31a7d4f5df3908c9fb656 Apr 22 18:51:18.947997 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.947960 2568 generic.go:358] "Generic (PLEG): container finished" podID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerID="83686012a200c28cecee3954d85240c99410c0d6ff897203562d0058d3c2fa69" exitCode=0 Apr 22 18:51:18.948181 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.948047 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" event={"ID":"2dcc86a4-83cf-4abd-a360-6246108d78d2","Type":"ContainerDied","Data":"83686012a200c28cecee3954d85240c99410c0d6ff897203562d0058d3c2fa69"} Apr 22 18:51:18.948181 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.948080 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" event={"ID":"2dcc86a4-83cf-4abd-a360-6246108d78d2","Type":"ContainerStarted","Data":"834de7b0df43421182d829ee5d2b091f0c3c686f82b8363e93fb19641f5de962"} Apr 22 18:51:18.949088 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:18.949065 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" event={"ID":"1847e710-1c28-449a-a860-79aa86be716c","Type":"ContainerStarted","Data":"7ec1f38e442dfcd63d7c45a6d14000bac818b27479f31a7d4f5df3908c9fb656"} Apr 22 18:51:21.963123 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:21.963086 2568 generic.go:358] "Generic (PLEG): container finished" podID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerID="d533e75b8ef2dd33e957982e70b5b63c54800a40dbf8804c9c1f39c46f6cea64" exitCode=0 Apr 22 18:51:21.963569 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:21.963178 2568 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" event={"ID":"2dcc86a4-83cf-4abd-a360-6246108d78d2","Type":"ContainerDied","Data":"d533e75b8ef2dd33e957982e70b5b63c54800a40dbf8804c9c1f39c46f6cea64"} Apr 22 18:51:21.964600 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:21.964575 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" event={"ID":"1847e710-1c28-449a-a860-79aa86be716c","Type":"ContainerStarted","Data":"8e55ebf75b94d54730f12d8abbf428ab53b44c8db71969ed54570fc6584a94bc"} Apr 22 18:51:21.964716 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:21.964702 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:22.021138 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:22.021096 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" podStartSLOduration=0.760089896 podStartE2EDuration="4.021081462s" podCreationTimestamp="2026-04-22 18:51:18 +0000 UTC" firstStartedPulling="2026-04-22 18:51:18.524851372 +0000 UTC m=+541.812287527" lastFinishedPulling="2026-04-22 18:51:21.785842931 +0000 UTC m=+545.073279093" observedRunningTime="2026-04-22 18:51:22.01957113 +0000 UTC m=+545.307007299" watchObservedRunningTime="2026-04-22 18:51:22.021081462 +0000 UTC m=+545.308517640" Apr 22 18:51:22.970095 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:22.970057 2568 generic.go:358] "Generic (PLEG): container finished" podID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerID="41b28b9ede86a12ab4ea803061a0d92586ddf7c6f6d5f4ab4a16b344d07296c5" exitCode=0 Apr 22 18:51:22.970461 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:22.970175 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" event={"ID":"2dcc86a4-83cf-4abd-a360-6246108d78d2","Type":"ContainerDied","Data":"41b28b9ede86a12ab4ea803061a0d92586ddf7c6f6d5f4ab4a16b344d07296c5"} Apr 22 18:51:24.092328 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.092306 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:24.133624 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.133592 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-util\") pod \"2dcc86a4-83cf-4abd-a360-6246108d78d2\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " Apr 22 18:51:24.133787 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.133638 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-bundle\") pod \"2dcc86a4-83cf-4abd-a360-6246108d78d2\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " Apr 22 18:51:24.133787 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.133745 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn8jw\" (UniqueName: \"kubernetes.io/projected/2dcc86a4-83cf-4abd-a360-6246108d78d2-kube-api-access-qn8jw\") pod \"2dcc86a4-83cf-4abd-a360-6246108d78d2\" (UID: \"2dcc86a4-83cf-4abd-a360-6246108d78d2\") " Apr 22 18:51:24.134055 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.134021 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-bundle" (OuterVolumeSpecName: "bundle") pod "2dcc86a4-83cf-4abd-a360-6246108d78d2" (UID: "2dcc86a4-83cf-4abd-a360-6246108d78d2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:24.135746 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.135723 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dcc86a4-83cf-4abd-a360-6246108d78d2-kube-api-access-qn8jw" (OuterVolumeSpecName: "kube-api-access-qn8jw") pod "2dcc86a4-83cf-4abd-a360-6246108d78d2" (UID: "2dcc86a4-83cf-4abd-a360-6246108d78d2"). InnerVolumeSpecName "kube-api-access-qn8jw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:24.138468 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.138434 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-util" (OuterVolumeSpecName: "util") pod "2dcc86a4-83cf-4abd-a360-6246108d78d2" (UID: "2dcc86a4-83cf-4abd-a360-6246108d78d2"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:24.234453 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.234369 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:24.234453 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.234399 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dcc86a4-83cf-4abd-a360-6246108d78d2-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:24.234453 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.234411 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qn8jw\" (UniqueName: \"kubernetes.io/projected/2dcc86a4-83cf-4abd-a360-6246108d78d2-kube-api-access-qn8jw\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:24.979915 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.979878 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" event={"ID":"2dcc86a4-83cf-4abd-a360-6246108d78d2","Type":"ContainerDied","Data":"834de7b0df43421182d829ee5d2b091f0c3c686f82b8363e93fb19641f5de962"} Apr 22 18:51:24.979915 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.979919 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="834de7b0df43421182d829ee5d2b091f0c3c686f82b8363e93fb19641f5de962" Apr 22 18:51:24.980108 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:24.979890 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87f7jnwf" Apr 22 18:51:25.667045 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:25.667012 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6c749d644c-mnmb8" Apr 22 18:51:25.667411 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:25.667061 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6c749d644c-mnmb8" Apr 22 18:51:25.671641 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:25.671617 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6c749d644c-mnmb8" Apr 22 18:51:25.987860 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:25.987784 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6c749d644c-mnmb8" Apr 22 18:51:26.036869 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:26.036841 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-799b5fb778-7p49l"] Apr 22 18:51:27.972989 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:27.972958 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-gswtb" Apr 22 18:51:29.435066 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.435035 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-w2kjr"] Apr 22 18:51:29.435435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.435358 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerName="extract" Apr 22 18:51:29.435435 ip-10-0-137-223 kubenswrapper[2568]: I0422 
Apr 22 18:51:29.435435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.435380 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerName="pull"
Apr 22 18:51:29.435435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.435385 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerName="pull"
Apr 22 18:51:29.435435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.435411 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerName="util"
Apr 22 18:51:29.435435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.435417 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerName="util"
Apr 22 18:51:29.435651 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.435466 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2dcc86a4-83cf-4abd-a360-6246108d78d2" containerName="extract"
Apr 22 18:51:29.440126 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.440109 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-w2kjr"
Apr 22 18:51:29.442568 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.442548 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-7djtb\""
Apr 22 18:51:29.448330 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.448309 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-w2kjr"]
Apr 22 18:51:29.483329 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.483296 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7bn\" (UniqueName: \"kubernetes.io/projected/eb8a330b-fd27-4e86-89a6-0ca478fb9607-kube-api-access-gt7bn\") pod \"cert-manager-79c8d999ff-w2kjr\" (UID: \"eb8a330b-fd27-4e86-89a6-0ca478fb9607\") " pod="cert-manager/cert-manager-79c8d999ff-w2kjr"
Apr 22 18:51:29.483482 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.483332 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb8a330b-fd27-4e86-89a6-0ca478fb9607-bound-sa-token\") pod \"cert-manager-79c8d999ff-w2kjr\" (UID: \"eb8a330b-fd27-4e86-89a6-0ca478fb9607\") " pod="cert-manager/cert-manager-79c8d999ff-w2kjr"
Apr 22 18:51:29.584490 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.584451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7bn\" (UniqueName: \"kubernetes.io/projected/eb8a330b-fd27-4e86-89a6-0ca478fb9607-kube-api-access-gt7bn\") pod \"cert-manager-79c8d999ff-w2kjr\" (UID: \"eb8a330b-fd27-4e86-89a6-0ca478fb9607\") " pod="cert-manager/cert-manager-79c8d999ff-w2kjr"
Apr 22 18:51:29.584490 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.584494 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb8a330b-fd27-4e86-89a6-0ca478fb9607-bound-sa-token\") pod \"cert-manager-79c8d999ff-w2kjr\" (UID: \"eb8a330b-fd27-4e86-89a6-0ca478fb9607\") " pod="cert-manager/cert-manager-79c8d999ff-w2kjr"
Apr 22 18:51:29.593973 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.593950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb8a330b-fd27-4e86-89a6-0ca478fb9607-bound-sa-token\") pod \"cert-manager-79c8d999ff-w2kjr\" (UID: \"eb8a330b-fd27-4e86-89a6-0ca478fb9607\") " pod="cert-manager/cert-manager-79c8d999ff-w2kjr"
Apr 22 18:51:29.595430 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.595399 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7bn\" (UniqueName: \"kubernetes.io/projected/eb8a330b-fd27-4e86-89a6-0ca478fb9607-kube-api-access-gt7bn\") pod \"cert-manager-79c8d999ff-w2kjr\" (UID: \"eb8a330b-fd27-4e86-89a6-0ca478fb9607\") " pod="cert-manager/cert-manager-79c8d999ff-w2kjr"
Apr 22 18:51:29.749366 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.749273 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-w2kjr"
Apr 22 18:51:29.875097 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.875071 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-w2kjr"]
Apr 22 18:51:29.876996 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:51:29.876968 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb8a330b_fd27_4e86_89a6_0ca478fb9607.slice/crio-a03596e4abe931f8552550d73b4d30944333409026ce2bb9e6a3310b8b525412 WatchSource:0}: Error finding container a03596e4abe931f8552550d73b4d30944333409026ce2bb9e6a3310b8b525412: Status 404 returned error can't find the container with id a03596e4abe931f8552550d73b4d30944333409026ce2bb9e6a3310b8b525412
Apr 22 18:51:29.997023 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.996987 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-w2kjr" event={"ID":"eb8a330b-fd27-4e86-89a6-0ca478fb9607","Type":"ContainerStarted","Data":"d6921b9e36aec5d5909cbbe70d6d41efa59413fbf29bbaec193c3f9d4315c35f"}
Apr 22 18:51:29.997023 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:29.997022 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-w2kjr" event={"ID":"eb8a330b-fd27-4e86-89a6-0ca478fb9607","Type":"ContainerStarted","Data":"a03596e4abe931f8552550d73b4d30944333409026ce2bb9e6a3310b8b525412"}
Apr 22 18:51:30.024268 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:30.024169 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-w2kjr" podStartSLOduration=1.024154026 podStartE2EDuration="1.024154026s" podCreationTimestamp="2026-04-22 18:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:51:30.022203046 +0000 UTC m=+553.309639237" watchObservedRunningTime="2026-04-22 18:51:30.024154026 +0000 UTC m=+553.311590201"
Apr 22 18:51:37.474006 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.473969 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"]
Apr 22 18:51:37.477707 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.477686 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.485473 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.485453 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 22 18:51:37.485913 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.485899 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 22 18:51:37.485960 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.485901 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9j2jm\""
Apr 22 18:51:37.494652 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.494631 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"]
Apr 22 18:51:37.553556 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.553521 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.553716 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.553615 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.553716 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.553684 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zcnv\" (UniqueName: \"kubernetes.io/projected/9f3b0629-c63c-486f-ac24-18b4b6e920d9-kube-api-access-2zcnv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.654733 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.654698 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zcnv\" (UniqueName: \"kubernetes.io/projected/9f3b0629-c63c-486f-ac24-18b4b6e920d9-kube-api-access-2zcnv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.654880 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.654769 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.654880 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.654821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.655151 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.655131 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.655202 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.655185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.665096 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.665070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zcnv\" (UniqueName: \"kubernetes.io/projected/9f3b0629-c63c-486f-ac24-18b4b6e920d9-kube-api-access-2zcnv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Apr 22 18:51:37.787472 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.787371 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg" Apr 22 18:51:37.913413 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:37.913295 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg"] Apr 22 18:51:37.916393 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:51:37.916366 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3b0629_c63c_486f_ac24_18b4b6e920d9.slice/crio-a18fc8f0daa9efd8c1f66c78db2bd6c5e871357de274215aa64dd04b10679073 WatchSource:0}: Error finding container a18fc8f0daa9efd8c1f66c78db2bd6c5e871357de274215aa64dd04b10679073: Status 404 returned error can't find the container with id a18fc8f0daa9efd8c1f66c78db2bd6c5e871357de274215aa64dd04b10679073 Apr 22 18:51:38.028158 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:38.028125 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerID="d676516f3fb1a52fd755dc2c45fc16e519858d8208594cea473c4719895bec64" exitCode=0 Apr 22 18:51:38.028299 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:38.028184 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg" event={"ID":"9f3b0629-c63c-486f-ac24-18b4b6e920d9","Type":"ContainerDied","Data":"d676516f3fb1a52fd755dc2c45fc16e519858d8208594cea473c4719895bec64"} Apr 22 18:51:38.028299 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:38.028212 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg" event={"ID":"9f3b0629-c63c-486f-ac24-18b4b6e920d9","Type":"ContainerStarted","Data":"a18fc8f0daa9efd8c1f66c78db2bd6c5e871357de274215aa64dd04b10679073"} Apr 22 18:51:39.033628 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:39.033546 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerID="0c8a0a1ea01f5e99d10036c2374d6580a3307b149e320d2f922e81f5fd020c9a" exitCode=0 Apr 22 18:51:39.033628 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:39.033617 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg" event={"ID":"9f3b0629-c63c-486f-ac24-18b4b6e920d9","Type":"ContainerDied","Data":"0c8a0a1ea01f5e99d10036c2374d6580a3307b149e320d2f922e81f5fd020c9a"} Apr 22 18:51:40.038803 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:40.038765 2568 generic.go:358] "Generic (PLEG): container finished" podID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerID="1c6894c537b15b361d26c2e253122d321f4b1c4fc4192ec4fd581ec6aea621d5" exitCode=0 Apr 22 18:51:40.039162 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:40.038842 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg" event={"ID":"9f3b0629-c63c-486f-ac24-18b4b6e920d9","Type":"ContainerDied","Data":"1c6894c537b15b361d26c2e253122d321f4b1c4fc4192ec4fd581ec6aea621d5"} Apr 22 18:51:41.160451 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.160427 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg" Apr 22 18:51:41.286443 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.286414 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-bundle\") pod \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " Apr 22 18:51:41.286599 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.286520 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-util\") pod \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " Apr 22 18:51:41.286599 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.286558 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zcnv\" (UniqueName: \"kubernetes.io/projected/9f3b0629-c63c-486f-ac24-18b4b6e920d9-kube-api-access-2zcnv\") pod \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\" (UID: \"9f3b0629-c63c-486f-ac24-18b4b6e920d9\") " Apr 22 18:51:41.287138 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.287107 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-bundle" (OuterVolumeSpecName: "bundle") pod "9f3b0629-c63c-486f-ac24-18b4b6e920d9" (UID: "9f3b0629-c63c-486f-ac24-18b4b6e920d9"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:41.288512 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.288468 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3b0629-c63c-486f-ac24-18b4b6e920d9-kube-api-access-2zcnv" (OuterVolumeSpecName: "kube-api-access-2zcnv") pod "9f3b0629-c63c-486f-ac24-18b4b6e920d9" (UID: "9f3b0629-c63c-486f-ac24-18b4b6e920d9"). InnerVolumeSpecName "kube-api-access-2zcnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:41.294700 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.294639 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-util" (OuterVolumeSpecName: "util") pod "9f3b0629-c63c-486f-ac24-18b4b6e920d9" (UID: "9f3b0629-c63c-486f-ac24-18b4b6e920d9"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:41.388320 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.388278 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:41.388320 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.388318 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f3b0629-c63c-486f-ac24-18b4b6e920d9-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:41.388555 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:41.388332 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2zcnv\" (UniqueName: \"kubernetes.io/projected/9f3b0629-c63c-486f-ac24-18b4b6e920d9-kube-api-access-2zcnv\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:42.046753 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:42.046716 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg" event={"ID":"9f3b0629-c63c-486f-ac24-18b4b6e920d9","Type":"ContainerDied","Data":"a18fc8f0daa9efd8c1f66c78db2bd6c5e871357de274215aa64dd04b10679073"} Apr 22 18:51:42.046753 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:42.046756 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a18fc8f0daa9efd8c1f66c78db2bd6c5e871357de274215aa64dd04b10679073" Apr 22 18:51:42.046955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:42.046767 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5kqshg" Apr 22 18:51:48.358007 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.357972 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf"] Apr 22 18:51:48.358381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.358323 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerName="extract" Apr 22 18:51:48.358381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.358335 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerName="extract" Apr 22 18:51:48.358381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.358356 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerName="pull" Apr 22 18:51:48.358381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.358361 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerName="pull" Apr 22 18:51:48.358381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.358373 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerName="util" Apr 22 18:51:48.358381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.358378 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" containerName="util" Apr 22 18:51:48.358594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.358434 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f3b0629-c63c-486f-ac24-18b4b6e920d9" 
containerName="extract" Apr 22 18:51:48.362742 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.362723 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.365340 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.365317 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:51:48.365443 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.365374 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:51:48.366213 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.366199 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9j2jm\"" Apr 22 18:51:48.377817 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.377794 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf"] Apr 22 18:51:48.451221 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.451188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhfwr\" (UniqueName: \"kubernetes.io/projected/20332822-244d-4511-985d-335f4aee2b1e-kube-api-access-fhfwr\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.451372 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.451228 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.451372 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.451260 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.551857 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.551816 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhfwr\" (UniqueName: \"kubernetes.io/projected/20332822-244d-4511-985d-335f4aee2b1e-kube-api-access-fhfwr\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.552020 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.551881 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.552020 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.551937 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.552262 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.552245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.552300 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.552280 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.561626 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.561594 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhfwr\" (UniqueName: \"kubernetes.io/projected/20332822-244d-4511-985d-335f4aee2b1e-kube-api-access-fhfwr\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.672095 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.672011 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:48.813348 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:48.811389 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf"] Apr 22 18:51:48.817637 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:51:48.817607 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20332822_244d_4511_985d_335f4aee2b1e.slice/crio-c1ff6c92406ee2982ced7515b1c7a25e21e92cf34b836771bcabfd407b46801e WatchSource:0}: Error finding container c1ff6c92406ee2982ced7515b1c7a25e21e92cf34b836771bcabfd407b46801e: Status 404 returned error can't find the container with id c1ff6c92406ee2982ced7515b1c7a25e21e92cf34b836771bcabfd407b46801e Apr 22 18:51:49.071554 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:49.071493 2568 generic.go:358] "Generic (PLEG): container finished" podID="20332822-244d-4511-985d-335f4aee2b1e" containerID="b144be46c284a5deb1a553e9a7a7ab10d1d18f15b7d37a5f107034dd33aff484" exitCode=0 Apr 22 18:51:49.071730 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:49.071595 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" event={"ID":"20332822-244d-4511-985d-335f4aee2b1e","Type":"ContainerDied","Data":"b144be46c284a5deb1a553e9a7a7ab10d1d18f15b7d37a5f107034dd33aff484"} Apr 22 18:51:49.071730 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:49.071637 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" event={"ID":"20332822-244d-4511-985d-335f4aee2b1e","Type":"ContainerStarted","Data":"c1ff6c92406ee2982ced7515b1c7a25e21e92cf34b836771bcabfd407b46801e"} Apr 22 18:51:50.076663 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.076633 2568 generic.go:358] "Generic (PLEG): container finished" podID="20332822-244d-4511-985d-335f4aee2b1e" containerID="838d7603056ae639bed3ca7841ec56f5b1ad3ae20836b69d190a759348d939ea" exitCode=0 Apr 22 18:51:50.077062 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.076692 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" event={"ID":"20332822-244d-4511-985d-335f4aee2b1e","Type":"ContainerDied","Data":"838d7603056ae639bed3ca7841ec56f5b1ad3ae20836b69d190a759348d939ea"} Apr 22 18:51:50.679329 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.679296 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2"] Apr 22 18:51:50.682719 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.682696 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.685275 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.685245 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 18:51:50.685425 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.685334 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 18:51:50.685425 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.685336 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 18:51:50.685425 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.685412 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 18:51:50.685580 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.685337 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-srzjc\"" Apr 22 18:51:50.702295 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.702270 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2"] Apr 22 18:51:50.774955 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.774923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f72cd4e-5f6c-4405-a2ba-2700077a5303-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.775110 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.774967 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvl8j\" (UniqueName: \"kubernetes.io/projected/4f72cd4e-5f6c-4405-a2ba-2700077a5303-kube-api-access-mvl8j\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.775110 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.775002 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f72cd4e-5f6c-4405-a2ba-2700077a5303-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.875625 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.875575 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f72cd4e-5f6c-4405-a2ba-2700077a5303-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.875815 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.875634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvl8j\" (UniqueName: 
\"kubernetes.io/projected/4f72cd4e-5f6c-4405-a2ba-2700077a5303-kube-api-access-mvl8j\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.875815 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.875669 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f72cd4e-5f6c-4405-a2ba-2700077a5303-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.878205 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.878173 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4f72cd4e-5f6c-4405-a2ba-2700077a5303-apiservice-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.878575 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.878551 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4f72cd4e-5f6c-4405-a2ba-2700077a5303-webhook-cert\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.884857 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.884834 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvl8j\" (UniqueName: \"kubernetes.io/projected/4f72cd4e-5f6c-4405-a2ba-2700077a5303-kube-api-access-mvl8j\") pod \"opendatahub-operator-controller-manager-dd89cc56c-6szw2\" (UID: \"4f72cd4e-5f6c-4405-a2ba-2700077a5303\") " pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:50.993722 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:50.993622 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:51.056069 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.056011 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-799b5fb778-7p49l" podUID="2f87480e-3ca4-4274-a2fc-92b5cb9daa00" containerName="console" containerID="cri-o://90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e" gracePeriod=15 Apr 22 18:51:51.083562 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.083531 2568 generic.go:358] "Generic (PLEG): container finished" podID="20332822-244d-4511-985d-335f4aee2b1e" containerID="893c6a9e5b1f4173a06fbc4d347f3979fe9d7aa536da7d6ad5c9196232023ebc" exitCode=0 Apr 22 18:51:51.083853 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.083610 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" event={"ID":"20332822-244d-4511-985d-335f4aee2b1e","Type":"ContainerDied","Data":"893c6a9e5b1f4173a06fbc4d347f3979fe9d7aa536da7d6ad5c9196232023ebc"} Apr 22 18:51:51.123753 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.123725 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2"] Apr 22 18:51:51.127243 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:51:51.127215 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f72cd4e_5f6c_4405_a2ba_2700077a5303.slice/crio-5c135f2dd1990ecfc948f29ce8622659a8008f96a2a12358c7e01d591fdb99ba WatchSource:0}: Error finding container 5c135f2dd1990ecfc948f29ce8622659a8008f96a2a12358c7e01d591fdb99ba: Status 404 returned error can't find the container with id 5c135f2dd1990ecfc948f29ce8622659a8008f96a2a12358c7e01d591fdb99ba Apr 22 18:51:51.282964 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.282940 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-799b5fb778-7p49l_2f87480e-3ca4-4274-a2fc-92b5cb9daa00/console/0.log" Apr 22 18:51:51.283092 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.283002 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:51:51.380779 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.380745 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-config\") pod \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " Apr 22 18:51:51.380946 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.380811 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-serving-cert\") pod \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " Apr 22 18:51:51.380946 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.380837 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9xq4\" (UniqueName: \"kubernetes.io/projected/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-kube-api-access-k9xq4\") pod \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " Apr 22 18:51:51.380946 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.380852 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-oauth-serving-cert\") pod \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " Apr 22 18:51:51.380946 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.380872 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-oauth-config\") pod \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " Apr 22 18:51:51.380946 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.380907 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-trusted-ca-bundle\") pod \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " Apr 22 18:51:51.381174 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.380974 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-service-ca\") pod \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\" (UID: \"2f87480e-3ca4-4274-a2fc-92b5cb9daa00\") " Apr 22 18:51:51.381228 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.381160 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-config" (OuterVolumeSpecName: "console-config") pod "2f87480e-3ca4-4274-a2fc-92b5cb9daa00" (UID: "2f87480e-3ca4-4274-a2fc-92b5cb9daa00"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:51.381435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.381400 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-service-ca" (OuterVolumeSpecName: "service-ca") pod "2f87480e-3ca4-4274-a2fc-92b5cb9daa00" (UID: "2f87480e-3ca4-4274-a2fc-92b5cb9daa00"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:51.381435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.381416 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2f87480e-3ca4-4274-a2fc-92b5cb9daa00" (UID: "2f87480e-3ca4-4274-a2fc-92b5cb9daa00"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:51.381666 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.381486 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2f87480e-3ca4-4274-a2fc-92b5cb9daa00" (UID: "2f87480e-3ca4-4274-a2fc-92b5cb9daa00"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:51:51.383162 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.383139 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2f87480e-3ca4-4274-a2fc-92b5cb9daa00" (UID: "2f87480e-3ca4-4274-a2fc-92b5cb9daa00"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:51.383253 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.383169 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-kube-api-access-k9xq4" (OuterVolumeSpecName: "kube-api-access-k9xq4") pod "2f87480e-3ca4-4274-a2fc-92b5cb9daa00" (UID: "2f87480e-3ca4-4274-a2fc-92b5cb9daa00"). InnerVolumeSpecName "kube-api-access-k9xq4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:51.383253 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.383174 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2f87480e-3ca4-4274-a2fc-92b5cb9daa00" (UID: "2f87480e-3ca4-4274-a2fc-92b5cb9daa00"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:51:51.481673 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.481645 2568 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-service-ca\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:51.481673 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.481670 2568 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-config\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:51.481673 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.481681 2568 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-serving-cert\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:51.481901 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.481692 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k9xq4\" (UniqueName: \"kubernetes.io/projected/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-kube-api-access-k9xq4\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:51.481901 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.481701 2568 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-oauth-serving-cert\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:51.481901 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.481710 2568 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-console-oauth-config\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:51.481901 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:51.481719 2568 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f87480e-3ca4-4274-a2fc-92b5cb9daa00-trusted-ca-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:52.091292 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.091253 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" event={"ID":"4f72cd4e-5f6c-4405-a2ba-2700077a5303","Type":"ContainerStarted","Data":"5c135f2dd1990ecfc948f29ce8622659a8008f96a2a12358c7e01d591fdb99ba"} Apr 22 18:51:52.092895 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.092870 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-799b5fb778-7p49l_2f87480e-3ca4-4274-a2fc-92b5cb9daa00/console/0.log" Apr 22 18:51:52.093017 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.092910 2568 generic.go:358] "Generic (PLEG): container finished" podID="2f87480e-3ca4-4274-a2fc-92b5cb9daa00" containerID="90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e" exitCode=2 Apr 22 18:51:52.093017 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.092995 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-799b5fb778-7p49l" Apr 22 18:51:52.093143 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.093029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799b5fb778-7p49l" event={"ID":"2f87480e-3ca4-4274-a2fc-92b5cb9daa00","Type":"ContainerDied","Data":"90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e"} Apr 22 18:51:52.093143 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.093057 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-799b5fb778-7p49l" event={"ID":"2f87480e-3ca4-4274-a2fc-92b5cb9daa00","Type":"ContainerDied","Data":"5187dd93af806344309854785a70e4b25c38881437f2da837f88c79e829413ba"} Apr 22 18:51:52.093143 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.093078 2568 scope.go:117] "RemoveContainer" containerID="90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e" Apr 22 18:51:52.110611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.110468 2568 scope.go:117] "RemoveContainer" containerID="90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e" Apr 22 18:51:52.110964 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:51:52.110938 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e\": container with ID starting with 90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e not found: ID does not exist" containerID="90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e" Apr 22 18:51:52.111176 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.111127 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e"} err="failed to get container status \"90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e\": rpc error: code = NotFound desc = could not find container \"90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e\": container with ID starting with 90c5fbefaf9eb6a9d7e05a55a60174ec6204f424ba80f5022f72c0e19a5b8a7e not found: ID does not exist" Apr 22 18:51:52.135294 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.135253 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-799b5fb778-7p49l"] Apr 22 18:51:52.146355 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.146261 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-799b5fb778-7p49l"] Apr 22 18:51:52.256933 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.256908 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:52.391057 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.391028 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-bundle\") pod \"20332822-244d-4511-985d-335f4aee2b1e\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " Apr 22 18:51:52.391252 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.391137 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-util\") pod \"20332822-244d-4511-985d-335f4aee2b1e\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " Apr 22 18:51:52.391252 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.391193 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhfwr\" (UniqueName: \"kubernetes.io/projected/20332822-244d-4511-985d-335f4aee2b1e-kube-api-access-fhfwr\") pod \"20332822-244d-4511-985d-335f4aee2b1e\" (UID: \"20332822-244d-4511-985d-335f4aee2b1e\") " Apr 22 18:51:52.392020 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.391976 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-bundle" (OuterVolumeSpecName: "bundle") pod "20332822-244d-4511-985d-335f4aee2b1e" (UID: "20332822-244d-4511-985d-335f4aee2b1e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:52.394139 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.394104 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20332822-244d-4511-985d-335f4aee2b1e-kube-api-access-fhfwr" (OuterVolumeSpecName: "kube-api-access-fhfwr") pod "20332822-244d-4511-985d-335f4aee2b1e" (UID: "20332822-244d-4511-985d-335f4aee2b1e"). InnerVolumeSpecName "kube-api-access-fhfwr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:51:52.399561 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.399479 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-util" (OuterVolumeSpecName: "util") pod "20332822-244d-4511-985d-335f4aee2b1e" (UID: "20332822-244d-4511-985d-335f4aee2b1e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:51:52.492866 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.492833 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:52.492866 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.492865 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhfwr\" (UniqueName: \"kubernetes.io/projected/20332822-244d-4511-985d-335f4aee2b1e-kube-api-access-fhfwr\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:52.493030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:52.492879 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/20332822-244d-4511-985d-335f4aee2b1e-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:51:53.099399 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:53.099371 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" Apr 22 18:51:53.099399 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:53.099372 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c97lccf" event={"ID":"20332822-244d-4511-985d-335f4aee2b1e","Type":"ContainerDied","Data":"c1ff6c92406ee2982ced7515b1c7a25e21e92cf34b836771bcabfd407b46801e"} Apr 22 18:51:53.100059 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:53.099417 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1ff6c92406ee2982ced7515b1c7a25e21e92cf34b836771bcabfd407b46801e" Apr 22 18:51:53.308743 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:53.308707 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f87480e-3ca4-4274-a2fc-92b5cb9daa00" path="/var/lib/kubelet/pods/2f87480e-3ca4-4274-a2fc-92b5cb9daa00/volumes" Apr 22 18:51:54.105533 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:54.105484 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" event={"ID":"4f72cd4e-5f6c-4405-a2ba-2700077a5303","Type":"ContainerStarted","Data":"2f5eb200133bf1a2a24be935d1a4ff082246659314bd3ba0adf2bf04dfbb87b1"} Apr 22 18:51:54.105998 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:54.105640 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:51:54.144909 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:51:54.144853 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" podStartSLOduration=1.692160956 podStartE2EDuration="4.14483926s" podCreationTimestamp="2026-04-22 18:51:50 +0000 UTC" firstStartedPulling="2026-04-22 18:51:51.160669235 +0000 UTC m=+574.448105408" lastFinishedPulling="2026-04-22 18:51:53.613347554 +0000 UTC m=+576.900783712" observedRunningTime="2026-04-22 18:51:54.142656336 +0000 UTC m=+577.430092513" watchObservedRunningTime="2026-04-22 18:51:54.14483926 +0000 UTC m=+577.432275438" Apr 22 18:52:05.115213 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:05.115181 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="opendatahub/opendatahub-operator-controller-manager-dd89cc56c-6szw2" Apr 22 18:52:08.049566 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049532 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m"] Apr 22 18:52:08.049930 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049892 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20332822-244d-4511-985d-335f4aee2b1e" containerName="extract" Apr 22 18:52:08.049930 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049903 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="20332822-244d-4511-985d-335f4aee2b1e" containerName="extract" Apr 22 18:52:08.049930 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049915 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20332822-244d-4511-985d-335f4aee2b1e" containerName="util" Apr 22 18:52:08.049930 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049921 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="20332822-244d-4511-985d-335f4aee2b1e" containerName="util" Apr 22 18:52:08.050075 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049934 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20332822-244d-4511-985d-335f4aee2b1e" containerName="pull" Apr 22 18:52:08.050075 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049940 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="20332822-244d-4511-985d-335f4aee2b1e" containerName="pull" Apr 22 18:52:08.050075 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049950 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f87480e-3ca4-4274-a2fc-92b5cb9daa00" containerName="console" Apr 22 18:52:08.050075 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.049955 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f87480e-3ca4-4274-a2fc-92b5cb9daa00" containerName="console" Apr 22 18:52:08.050075 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.050008 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="20332822-244d-4511-985d-335f4aee2b1e" containerName="extract" Apr 22 18:52:08.050075 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.050014 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f87480e-3ca4-4274-a2fc-92b5cb9daa00" containerName="console" Apr 22 18:52:08.052843 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.052826 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.062763 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.062738 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 18:52:08.062763 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.062756 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 18:52:08.062933 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.062764 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 18:52:08.062933 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.062811 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kp9vn\"" Apr 22 18:52:08.062933 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.062893 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:52:08.069895 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.069878 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 18:52:08.120442 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.120404 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ddafe4a-a370-476f-92f1-e4a10f75f714-metrics-cert\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.120647 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.120453 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgr6t\" (UniqueName: \"kubernetes.io/projected/0ddafe4a-a370-476f-92f1-e4a10f75f714-kube-api-access-pgr6t\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.120647 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.120472 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ddafe4a-a370-476f-92f1-e4a10f75f714-manager-config\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.120647 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.120553 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ddafe4a-a370-476f-92f1-e4a10f75f714-cert\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.136120 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.136084 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m"] Apr 22 18:52:08.138311 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.138136 2568 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w"] Apr 22 18:52:08.141814 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.141793 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.152129 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.152103 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:52:08.152228 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.152113 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:52:08.152296 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.152251 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9j2jm\"" Apr 22 18:52:08.200962 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.200933 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w"] Apr 22 18:52:08.221423 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.221395 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.221594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.221433 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vk4f\" (UniqueName: \"kubernetes.io/projected/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-kube-api-access-9vk4f\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.221594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.221475 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ddafe4a-a370-476f-92f1-e4a10f75f714-cert\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.221594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.221536 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ddafe4a-a370-476f-92f1-e4a10f75f714-metrics-cert\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.221594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.221584 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgr6t\" (UniqueName: \"kubernetes.io/projected/0ddafe4a-a370-476f-92f1-e4a10f75f714-kube-api-access-pgr6t\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " 
pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.221780 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.221615 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ddafe4a-a370-476f-92f1-e4a10f75f714-manager-config\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.221780 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.221642 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.222272 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.222245 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0ddafe4a-a370-476f-92f1-e4a10f75f714-manager-config\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.223959 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.223941 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0ddafe4a-a370-476f-92f1-e4a10f75f714-metrics-cert\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.224109 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.224089 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ddafe4a-a370-476f-92f1-e4a10f75f714-cert\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.254378 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.254349 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgr6t\" (UniqueName: \"kubernetes.io/projected/0ddafe4a-a370-476f-92f1-e4a10f75f714-kube-api-access-pgr6t\") pod \"lws-controller-manager-54dc496758-6kd5m\" (UID: \"0ddafe4a-a370-476f-92f1-e4a10f75f714\") " pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.322201 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.322102 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.322201 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.322141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vk4f\" (UniqueName: \"kubernetes.io/projected/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-kube-api-access-9vk4f\") pod 
\"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.322201 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.322200 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.322638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.322580 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.322638 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.322595 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.351599 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.351565 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vk4f\" (UniqueName: \"kubernetes.io/projected/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-kube-api-access-9vk4f\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.362384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.362365 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:08.451826 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.451799 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:08.493165 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.493139 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m"] Apr 22 18:52:08.495774 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:52:08.495724 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ddafe4a_a370_476f_92f1_e4a10f75f714.slice/crio-bce67c983f084fd5f821f4b00eca60211c829fbad23b1ee6577b8683c17ba682 WatchSource:0}: Error finding container bce67c983f084fd5f821f4b00eca60211c829fbad23b1ee6577b8683c17ba682: Status 404 returned error can't find the container with id bce67c983f084fd5f821f4b00eca60211c829fbad23b1ee6577b8683c17ba682 Apr 22 18:52:08.587012 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:08.586985 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w"] Apr 22 18:52:08.588361 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:52:08.588336 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4369bcc2_b1cf_4b4a_a4ba_daaae548aac0.slice/crio-01a0cbdfa53735ee8676c5931b855694cf23257d5481c55811f7e80ea2c5a28e WatchSource:0}: Error finding container 01a0cbdfa53735ee8676c5931b855694cf23257d5481c55811f7e80ea2c5a28e: Status 404 returned error can't find the container with id 01a0cbdfa53735ee8676c5931b855694cf23257d5481c55811f7e80ea2c5a28e Apr 22 18:52:09.163020 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:09.162978 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" event={"ID":"0ddafe4a-a370-476f-92f1-e4a10f75f714","Type":"ContainerStarted","Data":"bce67c983f084fd5f821f4b00eca60211c829fbad23b1ee6577b8683c17ba682"} Apr 22 18:52:09.164692 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:09.164651 2568 generic.go:358] "Generic (PLEG): container finished" podID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerID="8ded165bbb9a9f8f9d206cae62e2bce7d97cfbbd90fc14c51b52783675e68033" exitCode=0 Apr 22 18:52:09.164876 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:09.164701 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" event={"ID":"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0","Type":"ContainerDied","Data":"8ded165bbb9a9f8f9d206cae62e2bce7d97cfbbd90fc14c51b52783675e68033"} Apr 22 18:52:09.164876 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:09.164738 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" event={"ID":"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0","Type":"ContainerStarted","Data":"01a0cbdfa53735ee8676c5931b855694cf23257d5481c55811f7e80ea2c5a28e"} Apr 22 18:52:10.169536 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:10.169482 2568 generic.go:358] "Generic (PLEG): container finished" podID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerID="19f44013858f88893176c1103dd378b28e86e8cf610036bfa4adc06c910ab041" exitCode=0 Apr 22 18:52:10.169972 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:10.169573 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" event={"ID":"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0","Type":"ContainerDied","Data":"19f44013858f88893176c1103dd378b28e86e8cf610036bfa4adc06c910ab041"} Apr 22 18:52:10.171089 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:10.171064 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" event={"ID":"0ddafe4a-a370-476f-92f1-e4a10f75f714","Type":"ContainerStarted","Data":"9bf794f30a640bf8617cd14ec4a7b2c8b7a0c012c03b9fa865a8100cfe055d07"} Apr 22 18:52:10.171203 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:10.171149 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:10.231685 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:10.231629 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" podStartSLOduration=1.964939267 podStartE2EDuration="3.231609978s" podCreationTimestamp="2026-04-22 18:52:07 +0000 UTC" firstStartedPulling="2026-04-22 18:52:08.497715266 +0000 UTC m=+591.785151436" lastFinishedPulling="2026-04-22 18:52:09.764385987 +0000 UTC m=+593.051822147" observedRunningTime="2026-04-22 18:52:10.230752747 +0000 UTC m=+593.518188925" watchObservedRunningTime="2026-04-22 18:52:10.231609978 +0000 UTC m=+593.519046161" Apr 22 18:52:11.176912 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:11.176877 2568 generic.go:358] "Generic (PLEG): container finished" podID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerID="f994f4eb6efab435e70356c519dccc463b40b71b5a890e063fd7cbd3fb9691c2" exitCode=0 Apr 22 18:52:11.177262 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:11.176960 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" event={"ID":"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0","Type":"ContainerDied","Data":"f994f4eb6efab435e70356c519dccc463b40b71b5a890e063fd7cbd3fb9691c2"} Apr 22 18:52:12.304821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.304798 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:12.358144 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.358115 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-util\") pod \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " Apr 22 18:52:12.358144 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.358148 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-bundle\") pod \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " Apr 22 18:52:12.358367 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.358211 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vk4f\" (UniqueName: \"kubernetes.io/projected/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-kube-api-access-9vk4f\") pod \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\" (UID: \"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0\") " Apr 22 18:52:12.359036 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.359009 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-bundle" (OuterVolumeSpecName: "bundle") pod "4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" (UID: "4369bcc2-b1cf-4b4a-a4ba-daaae548aac0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:12.360275 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.360255 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-kube-api-access-9vk4f" (OuterVolumeSpecName: "kube-api-access-9vk4f") pod "4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" (UID: "4369bcc2-b1cf-4b4a-a4ba-daaae548aac0"). InnerVolumeSpecName "kube-api-access-9vk4f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:52:12.363611 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.363588 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-util" (OuterVolumeSpecName: "util") pod "4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" (UID: "4369bcc2-b1cf-4b4a-a4ba-daaae548aac0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:12.458971 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.458870 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:52:12.458971 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.458910 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:52:12.458971 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:12.458923 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vk4f\" (UniqueName: \"kubernetes.io/projected/4369bcc2-b1cf-4b4a-a4ba-daaae548aac0-kube-api-access-9vk4f\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:52:13.191785 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:13.191745 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" event={"ID":"4369bcc2-b1cf-4b4a-a4ba-daaae548aac0","Type":"ContainerDied","Data":"01a0cbdfa53735ee8676c5931b855694cf23257d5481c55811f7e80ea2c5a28e"} Apr 22 18:52:13.191785 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:13.191789 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a0cbdfa53735ee8676c5931b855694cf23257d5481c55811f7e80ea2c5a28e" Apr 22 18:52:13.192036 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:13.191758 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hs85w" Apr 22 18:52:21.179325 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:21.179296 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-54dc496758-6kd5m" Apr 22 18:52:23.050066 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.050029 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7"] Apr 22 18:52:23.050463 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.050406 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerName="extract" Apr 22 18:52:23.050463 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.050417 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerName="extract" Apr 22 18:52:23.050463 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.050425 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerName="pull" Apr 22 18:52:23.050463 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.050430 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerName="pull" Apr 22 18:52:23.050463 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.050452 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerName="util" Apr 22 18:52:23.050463 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.050458 2568 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerName="util" Apr 22 18:52:23.050684 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.050530 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="4369bcc2-b1cf-4b4a-a4ba-daaae548aac0" containerName="extract" Apr 22 18:52:23.055348 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.055330 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.058999 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.058978 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 18:52:23.059223 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.059205 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 18:52:23.059979 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.059965 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9j2jm\"" Apr 22 18:52:23.067330 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.067309 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7"] Apr 22 18:52:23.151139 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.151097 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.151322 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.151155 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whcvd\" (UniqueName: \"kubernetes.io/projected/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-kube-api-access-whcvd\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.151322 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.151250 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.252110 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.252001 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.252110 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.252045 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-whcvd\" (UniqueName: \"kubernetes.io/projected/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-kube-api-access-whcvd\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.252110 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.252079 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.272566 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.252470 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-bundle\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.272566 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.252485 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-util\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.272566 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.267547 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-whcvd\" (UniqueName: \"kubernetes.io/projected/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-kube-api-access-whcvd\") pod \"7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.364469 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.364389 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:23.522398 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:23.522363 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7"] Apr 22 18:52:23.523819 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:52:23.523796 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0a458ab_7bfc_4a6e_b0b0_03ceb297e3dc.slice/crio-54a1fde40ab2217b0d330405e3ffba51bbb42f5ba6d545faf79731baed6d6cb1 WatchSource:0}: Error finding container 54a1fde40ab2217b0d330405e3ffba51bbb42f5ba6d545faf79731baed6d6cb1: Status 404 returned error can't find the container with id 54a1fde40ab2217b0d330405e3ffba51bbb42f5ba6d545faf79731baed6d6cb1 Apr 22 18:52:24.234868 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:24.234835 2568 generic.go:358] "Generic (PLEG): container finished" podID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerID="d9531124823778ab0a9f1c241211b2b83ff991e312529f773111a64420d9e33c" exitCode=0 Apr 22 18:52:24.235337 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:24.234915 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" event={"ID":"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc","Type":"ContainerDied","Data":"d9531124823778ab0a9f1c241211b2b83ff991e312529f773111a64420d9e33c"} Apr 22 18:52:24.235337 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:24.234947 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" event={"ID":"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc","Type":"ContainerStarted","Data":"54a1fde40ab2217b0d330405e3ffba51bbb42f5ba6d545faf79731baed6d6cb1"} Apr 22 18:52:25.243604 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:25.243515 2568 generic.go:358] "Generic (PLEG): container finished" podID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerID="74da7808d44530e767262436f43b084bea0f9b4fc9b8a37cebc2cbc828672fb5" exitCode=0 Apr 22 18:52:25.243604 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:25.243539 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" event={"ID":"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc","Type":"ContainerDied","Data":"74da7808d44530e767262436f43b084bea0f9b4fc9b8a37cebc2cbc828672fb5"} Apr 22 18:52:26.250067 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:26.250031 2568 generic.go:358] "Generic (PLEG): container finished" podID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerID="c45df9cada00899e54d791c7c2784f78a515c85906ee17468b9927fe885a7dd2" exitCode=0 Apr 22 18:52:26.250467 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:26.250093 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" event={"ID":"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc","Type":"ContainerDied","Data":"c45df9cada00899e54d791c7c2784f78a515c85906ee17468b9927fe885a7dd2"} Apr 22 18:52:27.388649 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.388624 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:27.492870 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.492837 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whcvd\" (UniqueName: \"kubernetes.io/projected/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-kube-api-access-whcvd\") pod \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " Apr 22 18:52:27.493030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.492882 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-util\") pod \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " Apr 22 18:52:27.493030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.492973 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-bundle\") pod \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\" (UID: \"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc\") " Apr 22 18:52:27.493979 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.493948 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-bundle" (OuterVolumeSpecName: "bundle") pod "d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" (UID: "d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:27.495022 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.495000 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-kube-api-access-whcvd" (OuterVolumeSpecName: "kube-api-access-whcvd") pod "d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" (UID: "d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc"). InnerVolumeSpecName "kube-api-access-whcvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:52:27.498682 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.498648 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-util" (OuterVolumeSpecName: "util") pod "d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" (UID: "d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:52:27.593990 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.593964 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:52:27.593990 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.593989 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-whcvd\" (UniqueName: \"kubernetes.io/projected/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-kube-api-access-whcvd\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:52:27.594157 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:27.594000 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:52:28.259779 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:28.259751 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" Apr 22 18:52:28.259953 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:28.259749 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7797e60c2c2aaccf623f93d365c9b5c5cd3662e5f903c80e749ff805ebgwzs7" event={"ID":"d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc","Type":"ContainerDied","Data":"54a1fde40ab2217b0d330405e3ffba51bbb42f5ba6d545faf79731baed6d6cb1"} Apr 22 18:52:28.259953 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:52:28.259862 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54a1fde40ab2217b0d330405e3ffba51bbb42f5ba6d545faf79731baed6d6cb1" Apr 22 18:53:13.976996 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.976960 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79"] Apr 22 18:53:13.977394 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.977326 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerName="util" Apr 22 18:53:13.977394 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.977341 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerName="util" Apr 22 18:53:13.977394 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.977353 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerName="extract" Apr 22 18:53:13.977394 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.977359 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerName="extract" Apr 22 18:53:13.977394 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.977380 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerName="pull" Apr 22 18:53:13.977394 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.977386 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" containerName="pull" Apr 22 18:53:13.977634 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.977446 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0a458ab-7bfc-4a6e-b0b0-03ceb297e3dc" 
containerName="extract" Apr 22 18:53:13.980474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.980454 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:13.983017 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.982989 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 18:53:13.983141 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.983022 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 18:53:13.983141 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.983049 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-dm89z\"" Apr 22 18:53:13.983412 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.983397 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 18:53:13.991834 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:13.991811 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79"] Apr 22 18:53:14.099360 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099331 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.099529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d6592537-fd96-4b35-a36c-623383f391e5-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.099529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099397 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxw8w\" (UniqueName: \"kubernetes.io/projected/d6592537-fd96-4b35-a36c-623383f391e5-kube-api-access-xxw8w\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.099616 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099549 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d6592537-fd96-4b35-a36c-623383f391e5-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.099653 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099619 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.099691 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099657 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d6592537-fd96-4b35-a36c-623383f391e5-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.099743 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099726 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.099829 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099762 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.099867 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.099823 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.200795 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.200758 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.200795 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.200793 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.200838 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" 
(UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.200859 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.200878 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d6592537-fd96-4b35-a36c-623383f391e5-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.200926 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxw8w\" (UniqueName: \"kubernetes.io/projected/d6592537-fd96-4b35-a36c-623383f391e5-kube-api-access-xxw8w\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.200989 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d6592537-fd96-4b35-a36c-623383f391e5-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201303 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.201039 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201303 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.201068 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d6592537-fd96-4b35-a36c-623383f391e5-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201443 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.201395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-workload-socket\") pod 
\"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201527 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.201443 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201619 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.201598 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.201619 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.201609 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.202029 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.202004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d6592537-fd96-4b35-a36c-623383f391e5-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.202988 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.202971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d6592537-fd96-4b35-a36c-623383f391e5-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.203358 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.203343 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d6592537-fd96-4b35-a36c-623383f391e5-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.209302 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.209267 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d6592537-fd96-4b35-a36c-623383f391e5-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 
18:53:14.209419 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.209318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxw8w\" (UniqueName: \"kubernetes.io/projected/d6592537-fd96-4b35-a36c-623383f391e5-kube-api-access-xxw8w\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f6zc79\" (UID: \"d6592537-fd96-4b35-a36c-623383f391e5\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.293726 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.293623 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:14.415058 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.415035 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79"] Apr 22 18:53:14.417791 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:53:14.417768 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6592537_fd96_4b35_a36c_623383f391e5.slice/crio-74415ee82a5e771c25d860be9efaa4d3e176e77cf39ba8beb6138a89589ea3a2 WatchSource:0}: Error finding container 74415ee82a5e771c25d860be9efaa4d3e176e77cf39ba8beb6138a89589ea3a2: Status 404 returned error can't find the container with id 74415ee82a5e771c25d860be9efaa4d3e176e77cf39ba8beb6138a89589ea3a2 Apr 22 18:53:14.423324 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:14.423294 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" event={"ID":"d6592537-fd96-4b35-a36c-623383f391e5","Type":"ContainerStarted","Data":"74415ee82a5e771c25d860be9efaa4d3e176e77cf39ba8beb6138a89589ea3a2"} Apr 22 18:53:16.716870 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:16.716829 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:53:16.717151 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:16.716905 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:53:16.717151 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:16.716932 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:53:17.437185 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:17.437145 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" event={"ID":"d6592537-fd96-4b35-a36c-623383f391e5","Type":"ContainerStarted","Data":"1ce16cd6221e30d10bbcd56e7e7d175782d9aeb7cdb19285c308f4b5ac5d4381"} Apr 22 18:53:17.456331 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:17.456250 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" podStartSLOduration=2.159464823 podStartE2EDuration="4.456233942s" podCreationTimestamp="2026-04-22 18:53:13 +0000 UTC" firstStartedPulling="2026-04-22 18:53:14.419803124 +0000 UTC m=+657.707239281" lastFinishedPulling="2026-04-22 18:53:16.716572241 
+0000 UTC m=+660.004008400" observedRunningTime="2026-04-22 18:53:17.455012783 +0000 UTC m=+660.742448963" watchObservedRunningTime="2026-04-22 18:53:17.456233942 +0000 UTC m=+660.743670119" Apr 22 18:53:18.294021 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:18.293978 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:18.298485 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:18.298460 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:18.440819 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:18.440782 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:18.441751 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:18.441734 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f6zc79" Apr 22 18:53:23.435774 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.435722 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh"] Apr 22 18:53:23.439671 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.439653 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.442223 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.442201 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:53:23.443174 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.443148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xztn6\"" Apr 22 18:53:23.443288 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.443162 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:53:23.446612 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.446589 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh"] Apr 22 18:53:23.584950 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.584912 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxpnr\" (UniqueName: \"kubernetes.io/projected/2c232082-9fba-4b55-9a9d-03825ab46808-kube-api-access-fxpnr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.585107 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.584957 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.585107 ip-10-0-137-223 kubenswrapper[2568]: I0422 
18:53:23.584979 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.686266 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.686162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxpnr\" (UniqueName: \"kubernetes.io/projected/2c232082-9fba-4b55-9a9d-03825ab46808-kube-api-access-fxpnr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.686266 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.686223 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.686266 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.686255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.686622 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.686588 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.686710 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.686685 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.694771 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.694744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxpnr\" (UniqueName: \"kubernetes.io/projected/2c232082-9fba-4b55-9a9d-03825ab46808-kube-api-access-fxpnr\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.750250 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.750216 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:23.834685 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.834655 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt"] Apr 22 18:53:23.842689 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.842659 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:23.847050 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.847019 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt"] Apr 22 18:53:23.876873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.876845 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh"] Apr 22 18:53:23.878127 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:53:23.878091 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c232082_9fba_4b55_9a9d_03825ab46808.slice/crio-9b3a5b14b0be6e83b7fe5b14401950720fb949a2f09bc82d444499a976464772 WatchSource:0}: Error finding container 9b3a5b14b0be6e83b7fe5b14401950720fb949a2f09bc82d444499a976464772: Status 404 returned error can't find the container with id 9b3a5b14b0be6e83b7fe5b14401950720fb949a2f09bc82d444499a976464772 Apr 22 18:53:23.989060 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.989031 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:23.989194 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.989084 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5bv5\" (UniqueName: \"kubernetes.io/projected/7e694063-aa04-400c-b495-0599ed097240-kube-api-access-w5bv5\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:23.989194 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:23.989139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:24.090426 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.090385 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5bv5\" (UniqueName: \"kubernetes.io/projected/7e694063-aa04-400c-b495-0599ed097240-kube-api-access-w5bv5\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:24.090632 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.090443 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:24.090632 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.090546 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:24.090873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.090853 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:24.090873 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.090867 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:24.098786 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.098763 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5bv5\" (UniqueName: \"kubernetes.io/projected/7e694063-aa04-400c-b495-0599ed097240-kube-api-access-w5bv5\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:24.155098 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.155065 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:24.279142 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.279118 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt"] Apr 22 18:53:24.280848 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:53:24.280817 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e694063_aa04_400c_b495_0599ed097240.slice/crio-81c3b2471ed04e9c9752643ff552a883a3f2a2fe7e5749c957f9906449f15a8e WatchSource:0}: Error finding container 81c3b2471ed04e9c9752643ff552a883a3f2a2fe7e5749c957f9906449f15a8e: Status 404 returned error can't find the container with id 81c3b2471ed04e9c9752643ff552a883a3f2a2fe7e5749c957f9906449f15a8e Apr 22 18:53:24.448448 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.448419 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7"] Apr 22 18:53:24.452066 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.452045 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.463772 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.462823 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7"] Apr 22 18:53:24.471184 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.471152 2568 generic.go:358] "Generic (PLEG): container finished" podID="2c232082-9fba-4b55-9a9d-03825ab46808" containerID="eaedcd18c22e10c2b1966773549e4a66f2251f802a4b73f4d579c030ba236f51" exitCode=0 Apr 22 18:53:24.471316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.471231 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" event={"ID":"2c232082-9fba-4b55-9a9d-03825ab46808","Type":"ContainerDied","Data":"eaedcd18c22e10c2b1966773549e4a66f2251f802a4b73f4d579c030ba236f51"} Apr 22 18:53:24.471316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.471276 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" event={"ID":"2c232082-9fba-4b55-9a9d-03825ab46808","Type":"ContainerStarted","Data":"9b3a5b14b0be6e83b7fe5b14401950720fb949a2f09bc82d444499a976464772"} Apr 22 18:53:24.476548 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.476525 2568 generic.go:358] "Generic (PLEG): container finished" podID="7e694063-aa04-400c-b495-0599ed097240" containerID="404824547bcdc98b5baa4702fb85c68ed12d268a50b1384130b72af77b68b3a2" exitCode=0 Apr 22 18:53:24.476656 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.476631 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" event={"ID":"7e694063-aa04-400c-b495-0599ed097240","Type":"ContainerDied","Data":"404824547bcdc98b5baa4702fb85c68ed12d268a50b1384130b72af77b68b3a2"} Apr 22 18:53:24.476723 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.476667 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" 
event={"ID":"7e694063-aa04-400c-b495-0599ed097240","Type":"ContainerStarted","Data":"81c3b2471ed04e9c9752643ff552a883a3f2a2fe7e5749c957f9906449f15a8e"} Apr 22 18:53:24.595648 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.595616 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.595835 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.595675 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.595835 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.595735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h5bj\" (UniqueName: \"kubernetes.io/projected/9286965c-ebc2-4e26-bac3-c03239bdf69d-kube-api-access-4h5bj\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.696423 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.696386 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.696599 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.696442 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.696599 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.696467 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h5bj\" (UniqueName: \"kubernetes.io/projected/9286965c-ebc2-4e26-bac3-c03239bdf69d-kube-api-access-4h5bj\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.696811 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.696786 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.696869 
ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.696846 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.705224 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.705200 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h5bj\" (UniqueName: \"kubernetes.io/projected/9286965c-ebc2-4e26-bac3-c03239bdf69d-kube-api-access-4h5bj\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.770607 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.770572 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:24.896208 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:24.896173 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7"] Apr 22 18:53:24.896993 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:53:24.896964 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9286965c_ebc2_4e26_bac3_c03239bdf69d.slice/crio-f003cc188331ddd9e4822c223ae4b3d1e496ce85178ba5f80eb3dd841244049c WatchSource:0}: Error finding container f003cc188331ddd9e4822c223ae4b3d1e496ce85178ba5f80eb3dd841244049c: Status 404 returned error can't find the container with id f003cc188331ddd9e4822c223ae4b3d1e496ce85178ba5f80eb3dd841244049c Apr 22 18:53:25.044420 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.044381 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8"] Apr 22 18:53:25.047882 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.047866 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.058586 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.058559 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8"] Apr 22 18:53:25.202130 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.202106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.202238 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.202137 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h76q8\" (UniqueName: \"kubernetes.io/projected/a0f7e806-cb23-46c1-8e64-e9a40b758857-kube-api-access-h76q8\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.202301 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.202279 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.302832 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.302794 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.302992 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.302890 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.302992 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.302920 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h76q8\" (UniqueName: \"kubernetes.io/projected/a0f7e806-cb23-46c1-8e64-e9a40b758857-kube-api-access-h76q8\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.303257 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.303225 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.303303 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.303235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.312521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.312482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h76q8\" (UniqueName: \"kubernetes.io/projected/a0f7e806-cb23-46c1-8e64-e9a40b758857-kube-api-access-h76q8\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.357600 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.357522 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:25.483294 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.483261 2568 generic.go:358] "Generic (PLEG): container finished" podID="7e694063-aa04-400c-b495-0599ed097240" containerID="1dd2fc59f82f53ae1b4b6789b9d56e0b7db42faf519ac82630c508ecb895700a" exitCode=0 Apr 22 18:53:25.483760 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.483345 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" event={"ID":"7e694063-aa04-400c-b495-0599ed097240","Type":"ContainerDied","Data":"1dd2fc59f82f53ae1b4b6789b9d56e0b7db42faf519ac82630c508ecb895700a"} Apr 22 18:53:25.485030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.485008 2568 generic.go:358] "Generic (PLEG): container finished" podID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerID="451d805d953e6122fb8d60ceebd57c66779f4420aa499b12fefaaf32a0f37469" exitCode=0 Apr 22 18:53:25.485150 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.485108 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" event={"ID":"9286965c-ebc2-4e26-bac3-c03239bdf69d","Type":"ContainerDied","Data":"451d805d953e6122fb8d60ceebd57c66779f4420aa499b12fefaaf32a0f37469"} Apr 22 18:53:25.485150 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.485132 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" event={"ID":"9286965c-ebc2-4e26-bac3-c03239bdf69d","Type":"ContainerStarted","Data":"f003cc188331ddd9e4822c223ae4b3d1e496ce85178ba5f80eb3dd841244049c"} Apr 22 18:53:25.487056 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.487033 2568 generic.go:358] "Generic (PLEG): container finished" podID="2c232082-9fba-4b55-9a9d-03825ab46808" containerID="5769f7fd35be577851a2f734895d90f7d23b2dcadc9b89ad1f60a7a66e90a9f4" exitCode=0 Apr 22 18:53:25.487147 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.487064 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" event={"ID":"2c232082-9fba-4b55-9a9d-03825ab46808","Type":"ContainerDied","Data":"5769f7fd35be577851a2f734895d90f7d23b2dcadc9b89ad1f60a7a66e90a9f4"} Apr 22 18:53:25.500761 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:25.500721 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8"] Apr 22 18:53:25.504419 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:53:25.504396 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0f7e806_cb23_46c1_8e64_e9a40b758857.slice/crio-7d83b6cfec84b8880380385c6330cd9f29eb4f4c802a3428978597a22e112cea WatchSource:0}: Error finding container 7d83b6cfec84b8880380385c6330cd9f29eb4f4c802a3428978597a22e112cea: Status 404 returned error can't find the container with id 7d83b6cfec84b8880380385c6330cd9f29eb4f4c802a3428978597a22e112cea Apr 22 18:53:26.492173 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.492142 2568 generic.go:358] "Generic (PLEG): container finished" podID="2c232082-9fba-4b55-9a9d-03825ab46808" containerID="c40bf86e852159fc52e06d837c2a65d8a9b4a3892c4c00f9a70975006d5527b7" exitCode=0 Apr 22 18:53:26.492581 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.492275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" event={"ID":"2c232082-9fba-4b55-9a9d-03825ab46808","Type":"ContainerDied","Data":"c40bf86e852159fc52e06d837c2a65d8a9b4a3892c4c00f9a70975006d5527b7"} Apr 22 18:53:26.493972 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.493943 2568 generic.go:358] "Generic (PLEG): container finished" podID="7e694063-aa04-400c-b495-0599ed097240" containerID="24ed604862d2ed618441610dac12ff88842991f046c6280708333759924d31d9" exitCode=0 Apr 22 18:53:26.494086 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.494021 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" event={"ID":"7e694063-aa04-400c-b495-0599ed097240","Type":"ContainerDied","Data":"24ed604862d2ed618441610dac12ff88842991f046c6280708333759924d31d9"} Apr 22 18:53:26.495241 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.495220 2568 generic.go:358] "Generic (PLEG): container finished" podID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerID="70fd53bd3e69c86e050eeaa08edea07761c35c1157e7263a4c0d313a07733041" exitCode=0 Apr 22 18:53:26.495339 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.495290 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" event={"ID":"a0f7e806-cb23-46c1-8e64-e9a40b758857","Type":"ContainerDied","Data":"70fd53bd3e69c86e050eeaa08edea07761c35c1157e7263a4c0d313a07733041"} Apr 22 18:53:26.495339 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.495312 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" event={"ID":"a0f7e806-cb23-46c1-8e64-e9a40b758857","Type":"ContainerStarted","Data":"7d83b6cfec84b8880380385c6330cd9f29eb4f4c802a3428978597a22e112cea"} Apr 22 18:53:26.496901 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.496881 2568 generic.go:358] "Generic (PLEG): container finished" podID="9286965c-ebc2-4e26-bac3-c03239bdf69d" 
containerID="396d2ea27910eaa6596303889874ec2a6ad062cabbf81bcc77d0fd91d2106e04" exitCode=0 Apr 22 18:53:26.496987 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:26.496915 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" event={"ID":"9286965c-ebc2-4e26-bac3-c03239bdf69d","Type":"ContainerDied","Data":"396d2ea27910eaa6596303889874ec2a6ad062cabbf81bcc77d0fd91d2106e04"} Apr 22 18:53:27.502325 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.502287 2568 generic.go:358] "Generic (PLEG): container finished" podID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerID="9b2f8ba80822615558cae86ad66a41a5392bb9f84e65064718400c4bb226d81d" exitCode=0 Apr 22 18:53:27.502752 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.502379 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" event={"ID":"a0f7e806-cb23-46c1-8e64-e9a40b758857","Type":"ContainerDied","Data":"9b2f8ba80822615558cae86ad66a41a5392bb9f84e65064718400c4bb226d81d"} Apr 22 18:53:27.504341 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.504315 2568 generic.go:358] "Generic (PLEG): container finished" podID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerID="4fc414fefb999620c4126e6bd8f7fa6a40ee4abf26758476b80e1a249b1b99d9" exitCode=0 Apr 22 18:53:27.504341 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.504330 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" event={"ID":"9286965c-ebc2-4e26-bac3-c03239bdf69d","Type":"ContainerDied","Data":"4fc414fefb999620c4126e6bd8f7fa6a40ee4abf26758476b80e1a249b1b99d9"} Apr 22 18:53:27.639639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.639616 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:27.708293 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.708269 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:27.725203 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.725138 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-util\") pod \"7e694063-aa04-400c-b495-0599ed097240\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " Apr 22 18:53:27.725203 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.725189 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5bv5\" (UniqueName: \"kubernetes.io/projected/7e694063-aa04-400c-b495-0599ed097240-kube-api-access-w5bv5\") pod \"7e694063-aa04-400c-b495-0599ed097240\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " Apr 22 18:53:27.725420 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.725317 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-bundle\") pod \"7e694063-aa04-400c-b495-0599ed097240\" (UID: \"7e694063-aa04-400c-b495-0599ed097240\") " Apr 22 18:53:27.726011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.725952 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-bundle" (OuterVolumeSpecName: "bundle") pod "7e694063-aa04-400c-b495-0599ed097240" (UID: "7e694063-aa04-400c-b495-0599ed097240"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:27.727878 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.727849 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e694063-aa04-400c-b495-0599ed097240-kube-api-access-w5bv5" (OuterVolumeSpecName: "kube-api-access-w5bv5") pod "7e694063-aa04-400c-b495-0599ed097240" (UID: "7e694063-aa04-400c-b495-0599ed097240"). InnerVolumeSpecName "kube-api-access-w5bv5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:27.731341 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.731319 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-util" (OuterVolumeSpecName: "util") pod "7e694063-aa04-400c-b495-0599ed097240" (UID: "7e694063-aa04-400c-b495-0599ed097240"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:27.826278 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.826243 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-util\") pod \"2c232082-9fba-4b55-9a9d-03825ab46808\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " Apr 22 18:53:27.826437 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.826311 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxpnr\" (UniqueName: \"kubernetes.io/projected/2c232082-9fba-4b55-9a9d-03825ab46808-kube-api-access-fxpnr\") pod \"2c232082-9fba-4b55-9a9d-03825ab46808\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " Apr 22 18:53:27.826437 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.826331 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-bundle\") pod \"2c232082-9fba-4b55-9a9d-03825ab46808\" (UID: \"2c232082-9fba-4b55-9a9d-03825ab46808\") " Apr 22 18:53:27.826595 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.826577 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:27.826635 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.826604 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7e694063-aa04-400c-b495-0599ed097240-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:27.826635 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.826620 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5bv5\" (UniqueName: \"kubernetes.io/projected/7e694063-aa04-400c-b495-0599ed097240-kube-api-access-w5bv5\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:27.827009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.826983 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-bundle" (OuterVolumeSpecName: "bundle") pod "2c232082-9fba-4b55-9a9d-03825ab46808" (UID: "2c232082-9fba-4b55-9a9d-03825ab46808"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:27.828350 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.828332 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c232082-9fba-4b55-9a9d-03825ab46808-kube-api-access-fxpnr" (OuterVolumeSpecName: "kube-api-access-fxpnr") pod "2c232082-9fba-4b55-9a9d-03825ab46808" (UID: "2c232082-9fba-4b55-9a9d-03825ab46808"). InnerVolumeSpecName "kube-api-access-fxpnr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:27.831529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.831487 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-util" (OuterVolumeSpecName: "util") pod "2c232082-9fba-4b55-9a9d-03825ab46808" (UID: "2c232082-9fba-4b55-9a9d-03825ab46808"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:27.927892 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.927817 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fxpnr\" (UniqueName: \"kubernetes.io/projected/2c232082-9fba-4b55-9a9d-03825ab46808-kube-api-access-fxpnr\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:27.927892 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.927845 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:27.927892 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:27.927856 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2c232082-9fba-4b55-9a9d-03825ab46808-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:28.509539 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.509490 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" event={"ID":"7e694063-aa04-400c-b495-0599ed097240","Type":"ContainerDied","Data":"81c3b2471ed04e9c9752643ff552a883a3f2a2fe7e5749c957f9906449f15a8e"} Apr 22 18:53:28.509539 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.509541 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81c3b2471ed04e9c9752643ff552a883a3f2a2fe7e5749c957f9906449f15a8e" Apr 22 18:53:28.509539 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.509514 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt" Apr 22 18:53:28.511330 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.511305 2568 generic.go:358] "Generic (PLEG): container finished" podID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerID="beb9d447488b77a9397a21e4db3c5d3bf23d767c1cf5175bf101965f72132121" exitCode=0 Apr 22 18:53:28.511463 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.511392 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" event={"ID":"a0f7e806-cb23-46c1-8e64-e9a40b758857","Type":"ContainerDied","Data":"beb9d447488b77a9397a21e4db3c5d3bf23d767c1cf5175bf101965f72132121"} Apr 22 18:53:28.513121 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.513093 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" Apr 22 18:53:28.513242 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.513120 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh" event={"ID":"2c232082-9fba-4b55-9a9d-03825ab46808","Type":"ContainerDied","Data":"9b3a5b14b0be6e83b7fe5b14401950720fb949a2f09bc82d444499a976464772"} Apr 22 18:53:28.513242 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.513144 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b3a5b14b0be6e83b7fe5b14401950720fb949a2f09bc82d444499a976464772" Apr 22 18:53:28.638761 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.638736 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:28.734165 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.734127 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-bundle\") pod \"9286965c-ebc2-4e26-bac3-c03239bdf69d\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " Apr 22 18:53:28.734326 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.734209 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h5bj\" (UniqueName: \"kubernetes.io/projected/9286965c-ebc2-4e26-bac3-c03239bdf69d-kube-api-access-4h5bj\") pod \"9286965c-ebc2-4e26-bac3-c03239bdf69d\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " Apr 22 18:53:28.734326 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.734244 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-util\") pod \"9286965c-ebc2-4e26-bac3-c03239bdf69d\" (UID: \"9286965c-ebc2-4e26-bac3-c03239bdf69d\") " Apr 22 18:53:28.734684 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.734653 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-bundle" (OuterVolumeSpecName: "bundle") pod "9286965c-ebc2-4e26-bac3-c03239bdf69d" (UID: "9286965c-ebc2-4e26-bac3-c03239bdf69d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:28.736417 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.736391 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9286965c-ebc2-4e26-bac3-c03239bdf69d-kube-api-access-4h5bj" (OuterVolumeSpecName: "kube-api-access-4h5bj") pod "9286965c-ebc2-4e26-bac3-c03239bdf69d" (UID: "9286965c-ebc2-4e26-bac3-c03239bdf69d"). InnerVolumeSpecName "kube-api-access-4h5bj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:28.739521 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.739471 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-util" (OuterVolumeSpecName: "util") pod "9286965c-ebc2-4e26-bac3-c03239bdf69d" (UID: "9286965c-ebc2-4e26-bac3-c03239bdf69d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:28.834856 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.834819 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:28.834856 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.834850 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9286965c-ebc2-4e26-bac3-c03239bdf69d-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:28.834856 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:28.834860 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4h5bj\" (UniqueName: \"kubernetes.io/projected/9286965c-ebc2-4e26-bac3-c03239bdf69d-kube-api-access-4h5bj\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:29.518291 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.518257 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" event={"ID":"9286965c-ebc2-4e26-bac3-c03239bdf69d","Type":"ContainerDied","Data":"f003cc188331ddd9e4822c223ae4b3d1e496ce85178ba5f80eb3dd841244049c"} Apr 22 18:53:29.518291 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.518286 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7" Apr 22 18:53:29.518749 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.518294 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f003cc188331ddd9e4822c223ae4b3d1e496ce85178ba5f80eb3dd841244049c" Apr 22 18:53:29.643152 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.643127 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:29.742545 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.742492 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-util\") pod \"a0f7e806-cb23-46c1-8e64-e9a40b758857\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " Apr 22 18:53:29.742735 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.742675 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-bundle\") pod \"a0f7e806-cb23-46c1-8e64-e9a40b758857\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " Apr 22 18:53:29.742735 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.742701 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h76q8\" (UniqueName: \"kubernetes.io/projected/a0f7e806-cb23-46c1-8e64-e9a40b758857-kube-api-access-h76q8\") pod \"a0f7e806-cb23-46c1-8e64-e9a40b758857\" (UID: \"a0f7e806-cb23-46c1-8e64-e9a40b758857\") " Apr 22 18:53:29.743243 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.743216 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-bundle" (OuterVolumeSpecName: "bundle") pod "a0f7e806-cb23-46c1-8e64-e9a40b758857" (UID: "a0f7e806-cb23-46c1-8e64-e9a40b758857"). 
InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:29.744874 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.744847 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0f7e806-cb23-46c1-8e64-e9a40b758857-kube-api-access-h76q8" (OuterVolumeSpecName: "kube-api-access-h76q8") pod "a0f7e806-cb23-46c1-8e64-e9a40b758857" (UID: "a0f7e806-cb23-46c1-8e64-e9a40b758857"). InnerVolumeSpecName "kube-api-access-h76q8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:53:29.748099 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.748078 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-util" (OuterVolumeSpecName: "util") pod "a0f7e806-cb23-46c1-8e64-e9a40b758857" (UID: "a0f7e806-cb23-46c1-8e64-e9a40b758857"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:53:29.844213 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.844119 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-util\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:29.844213 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.844155 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a0f7e806-cb23-46c1-8e64-e9a40b758857-bundle\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:29.844213 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:29.844167 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h76q8\" (UniqueName: \"kubernetes.io/projected/a0f7e806-cb23-46c1-8e64-e9a40b758857-kube-api-access-h76q8\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:53:30.523810 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:30.523777 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" Apr 22 18:53:30.524372 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:30.523766 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8" event={"ID":"a0f7e806-cb23-46c1-8e64-e9a40b758857","Type":"ContainerDied","Data":"7d83b6cfec84b8880380385c6330cd9f29eb4f4c802a3428978597a22e112cea"} Apr 22 18:53:30.524372 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:30.523882 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d83b6cfec84b8880380385c6330cd9f29eb4f4c802a3428978597a22e112cea" Apr 22 18:53:41.476512 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.476467 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9"] Apr 22 18:53:41.476991 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.476972 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c232082-9fba-4b55-9a9d-03825ab46808" containerName="util" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.476994 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c232082-9fba-4b55-9a9d-03825ab46808" containerName="util" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477005 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e694063-aa04-400c-b495-0599ed097240" containerName="extract" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477013 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e694063-aa04-400c-b495-0599ed097240" containerName="extract" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477025 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerName="util" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477034 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerName="util" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477047 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerName="extract" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477056 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerName="extract" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477065 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerName="extract" Apr 22 18:53:41.477072 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477073 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerName="extract" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477095 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerName="util" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477104 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerName="util" Apr 22 18:53:41.477474 ip-10-0-137-223 
kubenswrapper[2568]: I0422 18:53:41.477122 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e694063-aa04-400c-b495-0599ed097240" containerName="pull" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477129 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e694063-aa04-400c-b495-0599ed097240" containerName="pull" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477145 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c232082-9fba-4b55-9a9d-03825ab46808" containerName="pull" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477153 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c232082-9fba-4b55-9a9d-03825ab46808" containerName="pull" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477165 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerName="pull" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477172 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerName="pull" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477185 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e694063-aa04-400c-b495-0599ed097240" containerName="util" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477193 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e694063-aa04-400c-b495-0599ed097240" containerName="util" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477202 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c232082-9fba-4b55-9a9d-03825ab46808" containerName="extract" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477212 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c232082-9fba-4b55-9a9d-03825ab46808" containerName="extract" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477221 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerName="pull" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477229 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerName="pull" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477311 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c232082-9fba-4b55-9a9d-03825ab46808" containerName="extract" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477327 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="9286965c-ebc2-4e26-bac3-c03239bdf69d" containerName="extract" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477337 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0f7e806-cb23-46c1-8e64-e9a40b758857" containerName="extract" Apr 22 18:53:41.477474 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.477351 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e694063-aa04-400c-b495-0599ed097240" containerName="extract" Apr 22 18:53:41.480691 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.480668 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:41.485067 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.485048 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 18:53:41.485148 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.485066 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-brdcv\"" Apr 22 18:53:41.485200 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.485182 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 18:53:41.495304 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.495283 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9"] Apr 22 18:53:41.654675 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.654643 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e0948a9d-7d75-48d1-9d30-540fc3a53194-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gbrb9\" (UID: \"e0948a9d-7d75-48d1-9d30-540fc3a53194\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:41.654849 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.654788 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f86xl\" (UniqueName: \"kubernetes.io/projected/e0948a9d-7d75-48d1-9d30-540fc3a53194-kube-api-access-f86xl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gbrb9\" (UID: \"e0948a9d-7d75-48d1-9d30-540fc3a53194\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:41.755432 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.755333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f86xl\" (UniqueName: \"kubernetes.io/projected/e0948a9d-7d75-48d1-9d30-540fc3a53194-kube-api-access-f86xl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gbrb9\" (UID: \"e0948a9d-7d75-48d1-9d30-540fc3a53194\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:41.755432 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.755392 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e0948a9d-7d75-48d1-9d30-540fc3a53194-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gbrb9\" (UID: \"e0948a9d-7d75-48d1-9d30-540fc3a53194\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:41.755765 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.755736 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e0948a9d-7d75-48d1-9d30-540fc3a53194-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gbrb9\" (UID: \"e0948a9d-7d75-48d1-9d30-540fc3a53194\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:41.764862 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.764837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-f86xl\" (UniqueName: \"kubernetes.io/projected/e0948a9d-7d75-48d1-9d30-540fc3a53194-kube-api-access-f86xl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gbrb9\" (UID: \"e0948a9d-7d75-48d1-9d30-540fc3a53194\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:41.790680 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.790654 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:41.917306 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:41.917284 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9"] Apr 22 18:53:41.919194 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:53:41.919168 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0948a9d_7d75_48d1_9d30_540fc3a53194.slice/crio-1732b0aef6df4d7b9ba37fa2231d32720a7bb6e7bf18dd072054fadc2a34f7c5 WatchSource:0}: Error finding container 1732b0aef6df4d7b9ba37fa2231d32720a7bb6e7bf18dd072054fadc2a34f7c5: Status 404 returned error can't find the container with id 1732b0aef6df4d7b9ba37fa2231d32720a7bb6e7bf18dd072054fadc2a34f7c5 Apr 22 18:53:42.572004 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:42.571972 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" event={"ID":"e0948a9d-7d75-48d1-9d30-540fc3a53194","Type":"ContainerStarted","Data":"1732b0aef6df4d7b9ba37fa2231d32720a7bb6e7bf18dd072054fadc2a34f7c5"} Apr 22 18:53:47.593595 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:47.593516 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" event={"ID":"e0948a9d-7d75-48d1-9d30-540fc3a53194","Type":"ContainerStarted","Data":"eddd775e73c0c78a928621949c4d48e7704b50e1a7ade53ed17780d005b2b010"} Apr 22 18:53:47.593942 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:47.593637 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:53:47.619814 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:47.619771 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" podStartSLOduration=1.209601568 podStartE2EDuration="6.619743659s" podCreationTimestamp="2026-04-22 18:53:41 +0000 UTC" firstStartedPulling="2026-04-22 18:53:41.921349606 +0000 UTC m=+685.208785761" lastFinishedPulling="2026-04-22 18:53:47.331491696 +0000 UTC m=+690.618927852" observedRunningTime="2026-04-22 18:53:47.617610251 +0000 UTC m=+690.905046439" watchObservedRunningTime="2026-04-22 18:53:47.619743659 +0000 UTC m=+690.907179837" Apr 22 18:53:49.818921 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:49.818879 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qct57"] Apr 22 18:53:49.822375 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:49.822355 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-qct57" Apr 22 18:53:49.824869 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:49.824848 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-hpkpf\"" Apr 22 18:53:49.839118 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:49.839090 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qct57"] Apr 22 18:53:49.932786 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:49.932749 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wstg9\" (UniqueName: \"kubernetes.io/projected/b752037c-f7bc-4322-b5aa-7ff269e24c25-kube-api-access-wstg9\") pod \"authorino-operator-657f44b778-qct57\" (UID: \"b752037c-f7bc-4322-b5aa-7ff269e24c25\") " pod="kuadrant-system/authorino-operator-657f44b778-qct57" Apr 22 18:53:50.034120 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.034076 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wstg9\" (UniqueName: \"kubernetes.io/projected/b752037c-f7bc-4322-b5aa-7ff269e24c25-kube-api-access-wstg9\") pod \"authorino-operator-657f44b778-qct57\" (UID: \"b752037c-f7bc-4322-b5aa-7ff269e24c25\") " pod="kuadrant-system/authorino-operator-657f44b778-qct57" Apr 22 18:53:50.050917 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.050889 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wstg9\" (UniqueName: \"kubernetes.io/projected/b752037c-f7bc-4322-b5aa-7ff269e24c25-kube-api-access-wstg9\") pod \"authorino-operator-657f44b778-qct57\" (UID: \"b752037c-f7bc-4322-b5aa-7ff269e24c25\") " pod="kuadrant-system/authorino-operator-657f44b778-qct57" Apr 22 18:53:50.140260 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.132376 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-qct57" Apr 22 18:53:50.140260 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.133818 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx"] Apr 22 18:53:50.140260 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.140091 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.143185 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.143163 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xztn6\"" Apr 22 18:53:50.143620 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.143602 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 22 18:53:50.143730 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.143710 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 22 18:53:50.158566 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.158491 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx"] Apr 22 18:53:50.241361 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.241326 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/36212567-0437-4616-8bf3-8a7c2aecc713-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.241551 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.241369 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/36212567-0437-4616-8bf3-8a7c2aecc713-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.241551 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.241395 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m44bg\" (UniqueName: \"kubernetes.io/projected/36212567-0437-4616-8bf3-8a7c2aecc713-kube-api-access-m44bg\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.342356 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.342317 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/36212567-0437-4616-8bf3-8a7c2aecc713-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.342356 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.342353 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/36212567-0437-4616-8bf3-8a7c2aecc713-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.342641 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.342368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m44bg\" (UniqueName: \"kubernetes.io/projected/36212567-0437-4616-8bf3-8a7c2aecc713-kube-api-access-m44bg\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: 
\"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.342641 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:53:50.342452 2568 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 22 18:53:50.342641 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:53:50.342579 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36212567-0437-4616-8bf3-8a7c2aecc713-plugin-serving-cert podName:36212567-0437-4616-8bf3-8a7c2aecc713 nodeName:}" failed. No retries permitted until 2026-04-22 18:53:50.842536467 +0000 UTC m=+694.129972633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/36212567-0437-4616-8bf3-8a7c2aecc713-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-vhnlx" (UID: "36212567-0437-4616-8bf3-8a7c2aecc713") : secret "plugin-serving-cert" not found Apr 22 18:53:50.343116 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.343088 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/36212567-0437-4616-8bf3-8a7c2aecc713-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.352870 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.352848 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m44bg\" (UniqueName: \"kubernetes.io/projected/36212567-0437-4616-8bf3-8a7c2aecc713-kube-api-access-m44bg\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.481159 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.481124 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qct57"] Apr 22 18:53:50.482749 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:53:50.482719 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb752037c_f7bc_4322_b5aa_7ff269e24c25.slice/crio-2f19334f8c04d91fade4afcc464728180288b98cd00358b0a8daa3f31c08d69c WatchSource:0}: Error finding container 2f19334f8c04d91fade4afcc464728180288b98cd00358b0a8daa3f31c08d69c: Status 404 returned error can't find the container with id 2f19334f8c04d91fade4afcc464728180288b98cd00358b0a8daa3f31c08d69c Apr 22 18:53:50.605959 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.605919 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-qct57" event={"ID":"b752037c-f7bc-4322-b5aa-7ff269e24c25","Type":"ContainerStarted","Data":"2f19334f8c04d91fade4afcc464728180288b98cd00358b0a8daa3f31c08d69c"} Apr 22 18:53:50.848151 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.848114 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/36212567-0437-4616-8bf3-8a7c2aecc713-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:50.850414 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:50.850388 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/36212567-0437-4616-8bf3-8a7c2aecc713-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-vhnlx\" (UID: \"36212567-0437-4616-8bf3-8a7c2aecc713\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:51.068320 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:51.068284 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" Apr 22 18:53:51.193988 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:51.193960 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx"] Apr 22 18:53:51.195564 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:53:51.195531 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36212567_0437_4616_8bf3_8a7c2aecc713.slice/crio-9d99f67847c8812be075d4433f271406496ed962d61c5ecbb3f49108563ddfeb WatchSource:0}: Error finding container 9d99f67847c8812be075d4433f271406496ed962d61c5ecbb3f49108563ddfeb: Status 404 returned error can't find the container with id 9d99f67847c8812be075d4433f271406496ed962d61c5ecbb3f49108563ddfeb Apr 22 18:53:51.611833 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:51.611797 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" event={"ID":"36212567-0437-4616-8bf3-8a7c2aecc713","Type":"ContainerStarted","Data":"9d99f67847c8812be075d4433f271406496ed962d61c5ecbb3f49108563ddfeb"} Apr 22 18:53:53.622730 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:53.622684 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-qct57" event={"ID":"b752037c-f7bc-4322-b5aa-7ff269e24c25","Type":"ContainerStarted","Data":"c054aaf0fee401ddabf74b9db793ac23f9eca2834a09f826bbd8ee7b7f60810b"} Apr 22 18:53:53.623175 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:53.623005 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-qct57" Apr 22 18:53:53.645893 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:53.645773 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-qct57" podStartSLOduration=2.114475738 podStartE2EDuration="4.645756461s" podCreationTimestamp="2026-04-22 18:53:49 +0000 UTC" firstStartedPulling="2026-04-22 18:53:50.484654511 +0000 UTC m=+693.772090666" lastFinishedPulling="2026-04-22 18:53:53.015935216 +0000 UTC m=+696.303371389" observedRunningTime="2026-04-22 18:53:53.641955821 +0000 UTC m=+696.929392001" watchObservedRunningTime="2026-04-22 18:53:53.645756461 +0000 UTC m=+696.933192638" Apr 22 18:53:58.600921 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:53:58.600888 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:54:00.461422 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.461379 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9"] Apr 22 18:54:00.461834 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.461600 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" 
podUID="e0948a9d-7d75-48d1-9d30-540fc3a53194" containerName="manager" containerID="cri-o://eddd775e73c0c78a928621949c4d48e7704b50e1a7ade53ed17780d005b2b010" gracePeriod=2 Apr 22 18:54:00.476381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.476355 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9"] Apr 22 18:54:00.488515 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.488478 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b"] Apr 22 18:54:00.488884 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.488859 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e0948a9d-7d75-48d1-9d30-540fc3a53194" containerName="manager" Apr 22 18:54:00.488884 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.488875 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0948a9d-7d75-48d1-9d30-540fc3a53194" containerName="manager" Apr 22 18:54:00.489026 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.488956 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="e0948a9d-7d75-48d1-9d30-540fc3a53194" containerName="manager" Apr 22 18:54:00.495464 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.495435 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:00.503751 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.503716 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b"] Apr 22 18:54:00.519639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.519607 2568 status_manager.go:895] "Failed to get status for pod" podUID="e0948a9d-7d75-48d1-9d30-540fc3a53194" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-gbrb9\" is forbidden: User \"system:node:ip-10-0-137-223.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-223.ec2.internal' and this object" Apr 22 18:54:00.645375 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.645305 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d728b\" (UID: \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:00.645540 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.645447 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg9jn\" (UniqueName: \"kubernetes.io/projected/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-kube-api-access-pg9jn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d728b\" (UID: \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:00.656584 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.656538 2568 generic.go:358] "Generic (PLEG): container finished" podID="e0948a9d-7d75-48d1-9d30-540fc3a53194" containerID="eddd775e73c0c78a928621949c4d48e7704b50e1a7ade53ed17780d005b2b010" exitCode=0 Apr 22 18:54:00.705053 
ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.705031 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:54:00.707372 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.707339 2568 status_manager.go:895] "Failed to get status for pod" podUID="e0948a9d-7d75-48d1-9d30-540fc3a53194" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-gbrb9\" is forbidden: User \"system:node:ip-10-0-137-223.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-223.ec2.internal' and this object" Apr 22 18:54:00.746913 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.746848 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg9jn\" (UniqueName: \"kubernetes.io/projected/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-kube-api-access-pg9jn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d728b\" (UID: \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:00.747021 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.746955 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d728b\" (UID: \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:00.747316 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.747296 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d728b\" (UID: \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:00.754924 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.754894 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg9jn\" (UniqueName: \"kubernetes.io/projected/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-kube-api-access-pg9jn\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d728b\" (UID: \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:00.834886 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.834855 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:00.847844 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.847805 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f86xl\" (UniqueName: \"kubernetes.io/projected/e0948a9d-7d75-48d1-9d30-540fc3a53194-kube-api-access-f86xl\") pod \"e0948a9d-7d75-48d1-9d30-540fc3a53194\" (UID: \"e0948a9d-7d75-48d1-9d30-540fc3a53194\") " Apr 22 18:54:00.847972 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.847949 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e0948a9d-7d75-48d1-9d30-540fc3a53194-extensions-socket-volume\") pod \"e0948a9d-7d75-48d1-9d30-540fc3a53194\" (UID: \"e0948a9d-7d75-48d1-9d30-540fc3a53194\") " Apr 22 18:54:00.848598 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.848558 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0948a9d-7d75-48d1-9d30-540fc3a53194-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e0948a9d-7d75-48d1-9d30-540fc3a53194" (UID: "e0948a9d-7d75-48d1-9d30-540fc3a53194"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:00.849836 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.849804 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0948a9d-7d75-48d1-9d30-540fc3a53194-kube-api-access-f86xl" (OuterVolumeSpecName: "kube-api-access-f86xl") pod "e0948a9d-7d75-48d1-9d30-540fc3a53194" (UID: "e0948a9d-7d75-48d1-9d30-540fc3a53194"). InnerVolumeSpecName "kube-api-access-f86xl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:00.949202 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.949171 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f86xl\" (UniqueName: \"kubernetes.io/projected/e0948a9d-7d75-48d1-9d30-540fc3a53194-kube-api-access-f86xl\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:54:00.949202 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.949201 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e0948a9d-7d75-48d1-9d30-540fc3a53194-extensions-socket-volume\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:54:00.974335 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:00.974306 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b"] Apr 22 18:54:00.975464 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:54:00.975438 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99e0d1d8_2bd3_4130_9ca5_a06e17c697b6.slice/crio-7c50925230a2546851e7b0cfd9f51300e562c3c1df81a91cf61d73a1c4c82db2 WatchSource:0}: Error finding container 7c50925230a2546851e7b0cfd9f51300e562c3c1df81a91cf61d73a1c4c82db2: Status 404 returned error can't find the container with id 7c50925230a2546851e7b0cfd9f51300e562c3c1df81a91cf61d73a1c4c82db2 Apr 22 18:54:01.307336 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:01.307297 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0948a9d-7d75-48d1-9d30-540fc3a53194" path="/var/lib/kubelet/pods/e0948a9d-7d75-48d1-9d30-540fc3a53194/volumes" Apr 22 18:54:01.662575 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:01.662463 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gbrb9" Apr 22 18:54:01.662575 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:01.662532 2568 scope.go:117] "RemoveContainer" containerID="eddd775e73c0c78a928621949c4d48e7704b50e1a7ade53ed17780d005b2b010" Apr 22 18:54:01.664401 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:01.664341 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" event={"ID":"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6","Type":"ContainerStarted","Data":"fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25"} Apr 22 18:54:01.664401 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:01.664381 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" event={"ID":"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6","Type":"ContainerStarted","Data":"7c50925230a2546851e7b0cfd9f51300e562c3c1df81a91cf61d73a1c4c82db2"} Apr 22 18:54:01.665084 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:01.665061 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:01.693060 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:01.692999 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" podStartSLOduration=1.692979408 podStartE2EDuration="1.692979408s" podCreationTimestamp="2026-04-22 18:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:54:01.68917573 +0000 UTC m=+704.976611907" watchObservedRunningTime="2026-04-22 18:54:01.692979408 +0000 UTC m=+704.980415586" Apr 22 18:54:04.629999 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:04.629966 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-qct57" Apr 22 18:54:13.676308 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:13.676223 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:16.733239 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:16.733199 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" event={"ID":"36212567-0437-4616-8bf3-8a7c2aecc713","Type":"ContainerStarted","Data":"171f5a818791830c41efb2fbcb9c3801a63337e356ec8f55b690ff6774b1375e"} Apr 22 18:54:16.751137 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:16.751081 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-vhnlx" podStartSLOduration=2.265640633 podStartE2EDuration="26.751063435s" podCreationTimestamp="2026-04-22 18:53:50 +0000 UTC" firstStartedPulling="2026-04-22 18:53:51.197183419 +0000 UTC m=+694.484619574" lastFinishedPulling="2026-04-22 18:54:15.682606216 +0000 UTC m=+718.970042376" observedRunningTime="2026-04-22 18:54:16.748305405 +0000 UTC m=+720.035741588" watchObservedRunningTime="2026-04-22 18:54:16.751063435 +0000 UTC m=+720.038499614" Apr 22 18:54:17.297718 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:17.297682 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b"] 
Apr 22 18:54:17.297908 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:17.297882 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" podUID="99e0d1d8-2bd3-4130-9ca5-a06e17c697b6" containerName="manager" containerID="cri-o://fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25" gracePeriod=10 Apr 22 18:54:18.147256 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.147233 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:18.219748 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.219711 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-extensions-socket-volume\") pod \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\" (UID: \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\") " Apr 22 18:54:18.219908 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.219784 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg9jn\" (UniqueName: \"kubernetes.io/projected/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-kube-api-access-pg9jn\") pod \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\" (UID: \"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6\") " Apr 22 18:54:18.220071 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.220036 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "99e0d1d8-2bd3-4130-9ca5-a06e17c697b6" (UID: "99e0d1d8-2bd3-4130-9ca5-a06e17c697b6"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:54:18.221804 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.221780 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-kube-api-access-pg9jn" (OuterVolumeSpecName: "kube-api-access-pg9jn") pod "99e0d1d8-2bd3-4130-9ca5-a06e17c697b6" (UID: "99e0d1d8-2bd3-4130-9ca5-a06e17c697b6"). InnerVolumeSpecName "kube-api-access-pg9jn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:18.320597 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.320566 2568 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-extensions-socket-volume\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:54:18.320597 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.320593 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pg9jn\" (UniqueName: \"kubernetes.io/projected/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6-kube-api-access-pg9jn\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:54:18.743939 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.743846 2568 generic.go:358] "Generic (PLEG): container finished" podID="99e0d1d8-2bd3-4130-9ca5-a06e17c697b6" containerID="fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25" exitCode=0 Apr 22 18:54:18.743939 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.743907 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" Apr 22 18:54:18.743939 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.743927 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" event={"ID":"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6","Type":"ContainerDied","Data":"fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25"} Apr 22 18:54:18.744191 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.743968 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b" event={"ID":"99e0d1d8-2bd3-4130-9ca5-a06e17c697b6","Type":"ContainerDied","Data":"7c50925230a2546851e7b0cfd9f51300e562c3c1df81a91cf61d73a1c4c82db2"} Apr 22 18:54:18.744191 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.743987 2568 scope.go:117] "RemoveContainer" containerID="fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25" Apr 22 18:54:18.752957 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.752938 2568 scope.go:117] "RemoveContainer" containerID="fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25" Apr 22 18:54:18.753194 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:54:18.753175 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25\": container with ID starting with fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25 not found: ID does not exist" containerID="fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25" Apr 22 18:54:18.753241 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.753203 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25"} err="failed to get container status \"fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25\": rpc error: code = NotFound desc = could not find container \"fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25\": container with ID starting with fb23246b7c84d982bed254a38cafd4f0748eb45eb2a76a24fef1aba891e7ee25 not found: ID does not exist" Apr 22 18:54:18.767814 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.767787 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b"] Apr 22 18:54:18.776660 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:18.776640 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d728b"] Apr 22 18:54:19.307096 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:19.307056 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e0d1d8-2bd3-4130-9ca5-a06e17c697b6" path="/var/lib/kubelet/pods/99e0d1d8-2bd3-4130-9ca5-a06e17c697b6/volumes" Apr 22 18:54:33.637351 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.637318 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77"] Apr 22 18:54:33.637862 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.637689 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99e0d1d8-2bd3-4130-9ca5-a06e17c697b6" containerName="manager" Apr 22 18:54:33.637862 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.637700 2568 
state_mem.go:107] "Deleted CPUSet assignment" podUID="99e0d1d8-2bd3-4130-9ca5-a06e17c697b6" containerName="manager" Apr 22 18:54:33.637862 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.637776 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="99e0d1d8-2bd3-4130-9ca5-a06e17c697b6" containerName="manager" Apr 22 18:54:33.653012 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.652980 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.653173 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.653022 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77"] Apr 22 18:54:33.655621 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.655596 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-vnrwq\"" Apr 22 18:54:33.752154 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.752127 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.752311 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.752184 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.752311 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.752210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.752311 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.752236 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.752311 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.752254 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qjgc\" (UniqueName: \"kubernetes.io/projected/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-kube-api-access-2qjgc\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.752311 ip-10-0-137-223 kubenswrapper[2568]: I0422 
18:54:33.752282 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.752475 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.752314 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.752475 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.752368 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.752475 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.752400 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853266 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853227 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853284 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853368 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853422 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853446 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853529 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853871 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853603 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qjgc\" (UniqueName: \"kubernetes.io/projected/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-kube-api-access-2qjgc\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853871 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853656 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.853972 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.853950 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.854027 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.854001 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.854027 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.854017 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.854129 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.854001 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.854129 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.854120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.855825 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.855797 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.856068 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.856050 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.862055 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.862032 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.862140 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.862090 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qjgc\" (UniqueName: \"kubernetes.io/projected/cbb27c2e-e776-4b45-9d72-a8e02e0da32a-kube-api-access-2qjgc\") pod \"maas-default-gateway-openshift-default-845c6b4b48-rhh77\" (UID: \"cbb27c2e-e776-4b45-9d72-a8e02e0da32a\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:33.965406 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:33.965326 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:34.295736 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:34.295702 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77"] Apr 22 18:54:34.296995 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:54:34.296967 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb27c2e_e776_4b45_9d72_a8e02e0da32a.slice/crio-6e3830733d76f5fbf872ee96812c77fc7186b5c89bf00c2354fa20d42dd74838 WatchSource:0}: Error finding container 6e3830733d76f5fbf872ee96812c77fc7186b5c89bf00c2354fa20d42dd74838: Status 404 returned error can't find the container with id 6e3830733d76f5fbf872ee96812c77fc7186b5c89bf00c2354fa20d42dd74838 Apr 22 18:54:34.299011 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:34.298976 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:54:34.299133 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:34.299052 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:54:34.299133 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:34.299095 2568 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 22 18:54:34.809368 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:34.809329 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" event={"ID":"cbb27c2e-e776-4b45-9d72-a8e02e0da32a","Type":"ContainerStarted","Data":"3bc7f813b86bbbd54981c8184ce2233ae4302d15d74ae3bd9c54dd646edb8cd3"} Apr 22 18:54:34.809368 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:34.809372 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" event={"ID":"cbb27c2e-e776-4b45-9d72-a8e02e0da32a","Type":"ContainerStarted","Data":"6e3830733d76f5fbf872ee96812c77fc7186b5c89bf00c2354fa20d42dd74838"} Apr 22 18:54:34.829574 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:34.829519 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" podStartSLOduration=1.829480379 podStartE2EDuration="1.829480379s" podCreationTimestamp="2026-04-22 18:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:54:34.827746859 +0000 UTC m=+738.115183047" watchObservedRunningTime="2026-04-22 18:54:34.829480379 +0000 UTC m=+738.116916597" Apr 22 18:54:34.965843 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:34.965808 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:35.970552 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:35.970511 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:36.817058 ip-10-0-137-223 
kubenswrapper[2568]: I0422 18:54:36.817026 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:36.818155 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:36.818134 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-rhh77" Apr 22 18:54:38.309053 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.309022 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-lhh9j"] Apr 22 18:54:38.392290 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.392247 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-lhh9j"] Apr 22 18:54:38.392452 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.392385 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:38.394816 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.394786 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 22 18:54:38.403915 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.403892 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-lhh9j"] Apr 22 18:54:38.497588 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.497554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2354a6b8-112a-4ccc-9497-620d23b1008a-config-file\") pod \"limitador-limitador-7d549b5b-lhh9j\" (UID: \"2354a6b8-112a-4ccc-9497-620d23b1008a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:38.497745 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.497616 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xph\" (UniqueName: \"kubernetes.io/projected/2354a6b8-112a-4ccc-9497-620d23b1008a-kube-api-access-p4xph\") pod \"limitador-limitador-7d549b5b-lhh9j\" (UID: \"2354a6b8-112a-4ccc-9497-620d23b1008a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:38.598727 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.598642 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2354a6b8-112a-4ccc-9497-620d23b1008a-config-file\") pod \"limitador-limitador-7d549b5b-lhh9j\" (UID: \"2354a6b8-112a-4ccc-9497-620d23b1008a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:38.598727 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.598700 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xph\" (UniqueName: \"kubernetes.io/projected/2354a6b8-112a-4ccc-9497-620d23b1008a-kube-api-access-p4xph\") pod \"limitador-limitador-7d549b5b-lhh9j\" (UID: \"2354a6b8-112a-4ccc-9497-620d23b1008a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:38.599245 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.599225 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2354a6b8-112a-4ccc-9497-620d23b1008a-config-file\") pod \"limitador-limitador-7d549b5b-lhh9j\" (UID: 
\"2354a6b8-112a-4ccc-9497-620d23b1008a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:38.606749 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.606726 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xph\" (UniqueName: \"kubernetes.io/projected/2354a6b8-112a-4ccc-9497-620d23b1008a-kube-api-access-p4xph\") pod \"limitador-limitador-7d549b5b-lhh9j\" (UID: \"2354a6b8-112a-4ccc-9497-620d23b1008a\") " pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:38.706146 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.706110 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:38.853645 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:38.853622 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-lhh9j"] Apr 22 18:54:38.855089 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:54:38.855061 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2354a6b8_112a_4ccc_9497_620d23b1008a.slice/crio-217b5f8ea8a031440f085d83518a64eeac636015566468cf55810244889665e9 WatchSource:0}: Error finding container 217b5f8ea8a031440f085d83518a64eeac636015566468cf55810244889665e9: Status 404 returned error can't find the container with id 217b5f8ea8a031440f085d83518a64eeac636015566468cf55810244889665e9 Apr 22 18:54:39.351302 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.351269 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-ztmgb"] Apr 22 18:54:39.363197 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.363172 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ztmgb" Apr 22 18:54:39.366262 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.366235 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-jshl5\"" Apr 22 18:54:39.378180 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.378151 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-ztmgb"] Apr 22 18:54:39.404978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.404947 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2qhk\" (UniqueName: \"kubernetes.io/projected/b5aa9124-c07d-4f8b-966f-cbc36c500e90-kube-api-access-q2qhk\") pod \"authorino-7498df8756-ztmgb\" (UID: \"b5aa9124-c07d-4f8b-966f-cbc36c500e90\") " pod="kuadrant-system/authorino-7498df8756-ztmgb" Apr 22 18:54:39.506702 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.506670 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2qhk\" (UniqueName: \"kubernetes.io/projected/b5aa9124-c07d-4f8b-966f-cbc36c500e90-kube-api-access-q2qhk\") pod \"authorino-7498df8756-ztmgb\" (UID: \"b5aa9124-c07d-4f8b-966f-cbc36c500e90\") " pod="kuadrant-system/authorino-7498df8756-ztmgb" Apr 22 18:54:39.515850 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.515817 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2qhk\" (UniqueName: \"kubernetes.io/projected/b5aa9124-c07d-4f8b-966f-cbc36c500e90-kube-api-access-q2qhk\") pod \"authorino-7498df8756-ztmgb\" (UID: \"b5aa9124-c07d-4f8b-966f-cbc36c500e90\") " pod="kuadrant-system/authorino-7498df8756-ztmgb" Apr 22 18:54:39.675703 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.675620 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ztmgb" Apr 22 18:54:39.814984 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.814952 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-ztmgb"] Apr 22 18:54:39.816471 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:54:39.816439 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5aa9124_c07d_4f8b_966f_cbc36c500e90.slice/crio-ae990fc76fca59053a6023f349f1398665a79913e6224ba5a271e11228c73b56 WatchSource:0}: Error finding container ae990fc76fca59053a6023f349f1398665a79913e6224ba5a271e11228c73b56: Status 404 returned error can't find the container with id ae990fc76fca59053a6023f349f1398665a79913e6224ba5a271e11228c73b56 Apr 22 18:54:39.834211 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.834173 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ztmgb" event={"ID":"b5aa9124-c07d-4f8b-966f-cbc36c500e90","Type":"ContainerStarted","Data":"ae990fc76fca59053a6023f349f1398665a79913e6224ba5a271e11228c73b56"} Apr 22 18:54:39.835824 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:39.835793 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" event={"ID":"2354a6b8-112a-4ccc-9497-620d23b1008a","Type":"ContainerStarted","Data":"217b5f8ea8a031440f085d83518a64eeac636015566468cf55810244889665e9"} Apr 22 18:54:42.853189 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:42.853151 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" event={"ID":"2354a6b8-112a-4ccc-9497-620d23b1008a","Type":"ContainerStarted","Data":"367fdd4502e7986c3552430d9916fe51e0fb45eb89cbc5ab8c0193d4a19368ba"} Apr 22 18:54:42.853657 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:42.853288 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:42.873038 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:42.872979 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" podStartSLOduration=1.4181259480000001 podStartE2EDuration="4.872964499s" podCreationTimestamp="2026-04-22 18:54:38 +0000 UTC" firstStartedPulling="2026-04-22 18:54:38.856900576 +0000 UTC m=+742.144336737" lastFinishedPulling="2026-04-22 18:54:42.311739128 +0000 UTC m=+745.599175288" observedRunningTime="2026-04-22 18:54:42.869490735 +0000 UTC m=+746.156926972" watchObservedRunningTime="2026-04-22 18:54:42.872964499 +0000 UTC m=+746.160400702" Apr 22 18:54:45.867703 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:45.867668 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ztmgb" event={"ID":"b5aa9124-c07d-4f8b-966f-cbc36c500e90","Type":"ContainerStarted","Data":"0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b"} Apr 22 18:54:45.885135 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:45.885083 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-ztmgb" podStartSLOduration=1.308084278 podStartE2EDuration="6.885068592s" podCreationTimestamp="2026-04-22 18:54:39 +0000 UTC" firstStartedPulling="2026-04-22 18:54:39.818463122 +0000 UTC m=+743.105899278" lastFinishedPulling="2026-04-22 18:54:45.395447436 +0000 UTC m=+748.682883592" 
observedRunningTime="2026-04-22 18:54:45.8831083 +0000 UTC m=+749.170544479" watchObservedRunningTime="2026-04-22 18:54:45.885068592 +0000 UTC m=+749.172504770" Apr 22 18:54:53.506443 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:53.506399 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-lhh9j"] Apr 22 18:54:53.506874 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:53.506683 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" podUID="2354a6b8-112a-4ccc-9497-620d23b1008a" containerName="limitador" containerID="cri-o://367fdd4502e7986c3552430d9916fe51e0fb45eb89cbc5ab8c0193d4a19368ba" gracePeriod=30 Apr 22 18:54:53.507363 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:53.507284 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:53.901384 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:53.901348 2568 generic.go:358] "Generic (PLEG): container finished" podID="2354a6b8-112a-4ccc-9497-620d23b1008a" containerID="367fdd4502e7986c3552430d9916fe51e0fb45eb89cbc5ab8c0193d4a19368ba" exitCode=0 Apr 22 18:54:53.901566 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:53.901394 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" event={"ID":"2354a6b8-112a-4ccc-9497-620d23b1008a","Type":"ContainerDied","Data":"367fdd4502e7986c3552430d9916fe51e0fb45eb89cbc5ab8c0193d4a19368ba"} Apr 22 18:54:54.446186 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.446162 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:54.549894 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.549845 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2354a6b8-112a-4ccc-9497-620d23b1008a-config-file\") pod \"2354a6b8-112a-4ccc-9497-620d23b1008a\" (UID: \"2354a6b8-112a-4ccc-9497-620d23b1008a\") " Apr 22 18:54:54.549894 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.549901 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4xph\" (UniqueName: \"kubernetes.io/projected/2354a6b8-112a-4ccc-9497-620d23b1008a-kube-api-access-p4xph\") pod \"2354a6b8-112a-4ccc-9497-620d23b1008a\" (UID: \"2354a6b8-112a-4ccc-9497-620d23b1008a\") " Apr 22 18:54:54.550321 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.550293 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2354a6b8-112a-4ccc-9497-620d23b1008a-config-file" (OuterVolumeSpecName: "config-file") pod "2354a6b8-112a-4ccc-9497-620d23b1008a" (UID: "2354a6b8-112a-4ccc-9497-620d23b1008a"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:54:54.552042 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.552022 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2354a6b8-112a-4ccc-9497-620d23b1008a-kube-api-access-p4xph" (OuterVolumeSpecName: "kube-api-access-p4xph") pod "2354a6b8-112a-4ccc-9497-620d23b1008a" (UID: "2354a6b8-112a-4ccc-9497-620d23b1008a"). InnerVolumeSpecName "kube-api-access-p4xph". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:54:54.651473 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.651387 2568 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/2354a6b8-112a-4ccc-9497-620d23b1008a-config-file\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:54:54.651473 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.651416 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4xph\" (UniqueName: \"kubernetes.io/projected/2354a6b8-112a-4ccc-9497-620d23b1008a-kube-api-access-p4xph\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:54:54.907355 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.907277 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" Apr 22 18:54:54.907524 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.907280 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-lhh9j" event={"ID":"2354a6b8-112a-4ccc-9497-620d23b1008a","Type":"ContainerDied","Data":"217b5f8ea8a031440f085d83518a64eeac636015566468cf55810244889665e9"} Apr 22 18:54:54.907524 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.907385 2568 scope.go:117] "RemoveContainer" containerID="367fdd4502e7986c3552430d9916fe51e0fb45eb89cbc5ab8c0193d4a19368ba" Apr 22 18:54:54.928573 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.928538 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-lhh9j"] Apr 22 18:54:54.932067 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:54.932038 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-lhh9j"] Apr 22 18:54:55.306633 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:55.306599 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2354a6b8-112a-4ccc-9497-620d23b1008a" path="/var/lib/kubelet/pods/2354a6b8-112a-4ccc-9497-620d23b1008a/volumes" Apr 22 18:54:59.426282 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.426251 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-qfcrx"] Apr 22 18:54:59.426878 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.426644 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2354a6b8-112a-4ccc-9497-620d23b1008a" containerName="limitador" Apr 22 18:54:59.426878 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.426656 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="2354a6b8-112a-4ccc-9497-620d23b1008a" containerName="limitador" Apr 22 18:54:59.426878 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.426736 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="2354a6b8-112a-4ccc-9497-620d23b1008a" containerName="limitador" Apr 22 18:54:59.429456 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.429436 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:54:59.431993 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.431971 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 22 18:54:59.432101 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.432006 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-fxwdz\"" Apr 22 18:54:59.440285 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.440265 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-qfcrx"] Apr 22 18:54:59.597914 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.597877 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd-data\") pod \"postgres-868db5846d-qfcrx\" (UID: \"fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd\") " pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:54:59.598077 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.597939 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pgwc\" (UniqueName: \"kubernetes.io/projected/fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd-kube-api-access-5pgwc\") pod \"postgres-868db5846d-qfcrx\" (UID: \"fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd\") " pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:54:59.698923 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.698892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5pgwc\" (UniqueName: \"kubernetes.io/projected/fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd-kube-api-access-5pgwc\") pod \"postgres-868db5846d-qfcrx\" (UID: \"fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd\") " pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:54:59.699068 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.699026 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd-data\") pod \"postgres-868db5846d-qfcrx\" (UID: \"fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd\") " pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:54:59.699417 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.699397 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd-data\") pod \"postgres-868db5846d-qfcrx\" (UID: \"fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd\") " pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:54:59.706388 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.706360 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pgwc\" (UniqueName: \"kubernetes.io/projected/fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd-kube-api-access-5pgwc\") pod \"postgres-868db5846d-qfcrx\" (UID: \"fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd\") " pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:54:59.740829 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:54:59.740803 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:55:00.066789 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:00.066764 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-qfcrx"] Apr 22 18:55:00.068334 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:55:00.068303 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa4fc7d7_4ce3_456a_adbb_6086dac4b4fd.slice/crio-099c6405e1e67f70fb1360ffbd433f9705fba4d66a8a76c8f308184e1909a086 WatchSource:0}: Error finding container 099c6405e1e67f70fb1360ffbd433f9705fba4d66a8a76c8f308184e1909a086: Status 404 returned error can't find the container with id 099c6405e1e67f70fb1360ffbd433f9705fba4d66a8a76c8f308184e1909a086 Apr 22 18:55:00.933401 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:00.933366 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-qfcrx" event={"ID":"fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd","Type":"ContainerStarted","Data":"099c6405e1e67f70fb1360ffbd433f9705fba4d66a8a76c8f308184e1909a086"} Apr 22 18:55:05.956816 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:05.956775 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-qfcrx" event={"ID":"fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd","Type":"ContainerStarted","Data":"9cecbc8b4e2c0385e2aeaf5c50fb3e923c9cfd8f124c4d2a71e346983ffad749"} Apr 22 18:55:05.957253 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:05.956868 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:55:05.973351 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:05.973298 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-qfcrx" podStartSLOduration=1.778065601 podStartE2EDuration="6.973284276s" podCreationTimestamp="2026-04-22 18:54:59 +0000 UTC" firstStartedPulling="2026-04-22 18:55:00.069719102 +0000 UTC m=+763.357155258" lastFinishedPulling="2026-04-22 18:55:05.264937777 +0000 UTC m=+768.552373933" observedRunningTime="2026-04-22 18:55:05.971433784 +0000 UTC m=+769.258869963" watchObservedRunningTime="2026-04-22 18:55:05.973284276 +0000 UTC m=+769.260720453" Apr 22 18:55:11.989145 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:11.989117 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-qfcrx" Apr 22 18:55:13.029959 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.029920 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5f5b6f4f8d-hslqn"] Apr 22 18:55:13.035361 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.035339 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" Apr 22 18:55:13.037880 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.037858 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 22 18:55:13.039294 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.039273 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5f5b6f4f8d-hslqn"] Apr 22 18:55:13.130202 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.130161 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6545q\" (UniqueName: \"kubernetes.io/projected/d799c2a4-52c1-4688-93f7-69327f4f9388-kube-api-access-6545q\") pod \"authorino-5f5b6f4f8d-hslqn\" (UID: \"d799c2a4-52c1-4688-93f7-69327f4f9388\") " pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" Apr 22 18:55:13.130381 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.130222 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d799c2a4-52c1-4688-93f7-69327f4f9388-tls-cert\") pod \"authorino-5f5b6f4f8d-hslqn\" (UID: \"d799c2a4-52c1-4688-93f7-69327f4f9388\") " pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" Apr 22 18:55:13.231682 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.231643 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6545q\" (UniqueName: \"kubernetes.io/projected/d799c2a4-52c1-4688-93f7-69327f4f9388-kube-api-access-6545q\") pod \"authorino-5f5b6f4f8d-hslqn\" (UID: \"d799c2a4-52c1-4688-93f7-69327f4f9388\") " pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" Apr 22 18:55:13.231854 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.231697 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d799c2a4-52c1-4688-93f7-69327f4f9388-tls-cert\") pod \"authorino-5f5b6f4f8d-hslqn\" (UID: \"d799c2a4-52c1-4688-93f7-69327f4f9388\") " pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" Apr 22 18:55:13.234067 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.234048 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d799c2a4-52c1-4688-93f7-69327f4f9388-tls-cert\") pod \"authorino-5f5b6f4f8d-hslqn\" (UID: \"d799c2a4-52c1-4688-93f7-69327f4f9388\") " pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" Apr 22 18:55:13.241045 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.241021 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6545q\" (UniqueName: \"kubernetes.io/projected/d799c2a4-52c1-4688-93f7-69327f4f9388-kube-api-access-6545q\") pod \"authorino-5f5b6f4f8d-hslqn\" (UID: \"d799c2a4-52c1-4688-93f7-69327f4f9388\") " pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" Apr 22 18:55:13.346544 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.346493 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" Apr 22 18:55:13.465009 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.464984 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5f5b6f4f8d-hslqn"] Apr 22 18:55:13.466930 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:55:13.466906 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd799c2a4_52c1_4688_93f7_69327f4f9388.slice/crio-64a324a73dbce6dea782cde83c2ed8c7ac557554dd60a5456bd917874e65d5fd WatchSource:0}: Error finding container 64a324a73dbce6dea782cde83c2ed8c7ac557554dd60a5456bd917874e65d5fd: Status 404 returned error can't find the container with id 64a324a73dbce6dea782cde83c2ed8c7ac557554dd60a5456bd917874e65d5fd Apr 22 18:55:13.990758 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.990723 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" event={"ID":"d799c2a4-52c1-4688-93f7-69327f4f9388","Type":"ContainerStarted","Data":"e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308"} Apr 22 18:55:13.990924 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:13.990767 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" event={"ID":"d799c2a4-52c1-4688-93f7-69327f4f9388","Type":"ContainerStarted","Data":"64a324a73dbce6dea782cde83c2ed8c7ac557554dd60a5456bd917874e65d5fd"} Apr 22 18:55:14.021830 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.021767 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" podStartSLOduration=0.66352159 podStartE2EDuration="1.021750321s" podCreationTimestamp="2026-04-22 18:55:13 +0000 UTC" firstStartedPulling="2026-04-22 18:55:13.468185546 +0000 UTC m=+776.755621702" lastFinishedPulling="2026-04-22 18:55:13.826414274 +0000 UTC m=+777.113850433" observedRunningTime="2026-04-22 18:55:14.020560994 +0000 UTC m=+777.307997175" watchObservedRunningTime="2026-04-22 18:55:14.021750321 +0000 UTC m=+777.309186498" Apr 22 18:55:14.056319 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.056281 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-ztmgb"] Apr 22 18:55:14.056752 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.056469 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-ztmgb" podUID="b5aa9124-c07d-4f8b-966f-cbc36c500e90" containerName="authorino" containerID="cri-o://0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b" gracePeriod=30 Apr 22 18:55:14.293304 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.293279 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ztmgb" Apr 22 18:55:14.342111 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.342078 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2qhk\" (UniqueName: \"kubernetes.io/projected/b5aa9124-c07d-4f8b-966f-cbc36c500e90-kube-api-access-q2qhk\") pod \"b5aa9124-c07d-4f8b-966f-cbc36c500e90\" (UID: \"b5aa9124-c07d-4f8b-966f-cbc36c500e90\") " Apr 22 18:55:14.344067 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.344042 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5aa9124-c07d-4f8b-966f-cbc36c500e90-kube-api-access-q2qhk" (OuterVolumeSpecName: "kube-api-access-q2qhk") pod "b5aa9124-c07d-4f8b-966f-cbc36c500e90" (UID: "b5aa9124-c07d-4f8b-966f-cbc36c500e90"). InnerVolumeSpecName "kube-api-access-q2qhk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:14.443785 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.443708 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q2qhk\" (UniqueName: \"kubernetes.io/projected/b5aa9124-c07d-4f8b-966f-cbc36c500e90-kube-api-access-q2qhk\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:55:14.832818 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.832788 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cb27s"] Apr 22 18:55:14.833184 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.833171 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5aa9124-c07d-4f8b-966f-cbc36c500e90" containerName="authorino" Apr 22 18:55:14.833241 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.833185 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5aa9124-c07d-4f8b-966f-cbc36c500e90" containerName="authorino" Apr 22 18:55:14.833286 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.833243 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5aa9124-c07d-4f8b-966f-cbc36c500e90" containerName="authorino" Apr 22 18:55:14.835595 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.835574 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:14.839385 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.839368 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-jh8rj\"" Apr 22 18:55:14.848184 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.848164 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cb27s"] Apr 22 18:55:14.947312 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.947277 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc4t8\" (UniqueName: \"kubernetes.io/projected/1ee928ff-08a3-4e74-9dd3-04d00096d823-kube-api-access-gc4t8\") pod \"maas-controller-6d4c8f55f9-cb27s\" (UID: \"1ee928ff-08a3-4e74-9dd3-04d00096d823\") " pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:14.970486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.970457 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-68cddf8959-rpkqj"] Apr 22 18:55:14.973092 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.973078 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-68cddf8959-rpkqj" Apr 22 18:55:14.985051 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:14.985026 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-68cddf8959-rpkqj"] Apr 22 18:55:15.002450 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.002405 2568 generic.go:358] "Generic (PLEG): container finished" podID="b5aa9124-c07d-4f8b-966f-cbc36c500e90" containerID="0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b" exitCode=0 Apr 22 18:55:15.002630 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.002549 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ztmgb" event={"ID":"b5aa9124-c07d-4f8b-966f-cbc36c500e90","Type":"ContainerDied","Data":"0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b"} Apr 22 18:55:15.002630 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.002589 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ztmgb" event={"ID":"b5aa9124-c07d-4f8b-966f-cbc36c500e90","Type":"ContainerDied","Data":"ae990fc76fca59053a6023f349f1398665a79913e6224ba5a271e11228c73b56"} Apr 22 18:55:15.002630 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.002602 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ztmgb" Apr 22 18:55:15.002630 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.002616 2568 scope.go:117] "RemoveContainer" containerID="0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b" Apr 22 18:55:15.013095 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.013073 2568 scope.go:117] "RemoveContainer" containerID="0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b" Apr 22 18:55:15.013393 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:55:15.013372 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b\": container with ID starting with 0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b not found: ID does not exist" containerID="0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b" Apr 22 18:55:15.013459 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.013401 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b"} err="failed to get container status \"0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b\": rpc error: code = NotFound desc = could not find container \"0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b\": container with ID starting with 0351610ca75193cd8e8ba6ecc39df744f0facceac4c5c4e0cb9aa41aeae9f23b not found: ID does not exist" Apr 22 18:55:15.025932 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.025903 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-ztmgb"] Apr 22 18:55:15.029689 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.029664 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-ztmgb"] Apr 22 18:55:15.048570 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.048536 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gc4t8\" (UniqueName: 
\"kubernetes.io/projected/1ee928ff-08a3-4e74-9dd3-04d00096d823-kube-api-access-gc4t8\") pod \"maas-controller-6d4c8f55f9-cb27s\" (UID: \"1ee928ff-08a3-4e74-9dd3-04d00096d823\") " pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:15.048691 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.048616 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w74f\" (UniqueName: \"kubernetes.io/projected/5f54371d-e00f-40af-9e1e-af1108b6bc1e-kube-api-access-6w74f\") pod \"maas-controller-68cddf8959-rpkqj\" (UID: \"5f54371d-e00f-40af-9e1e-af1108b6bc1e\") " pod="opendatahub/maas-controller-68cddf8959-rpkqj" Apr 22 18:55:15.056467 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.056439 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc4t8\" (UniqueName: \"kubernetes.io/projected/1ee928ff-08a3-4e74-9dd3-04d00096d823-kube-api-access-gc4t8\") pod \"maas-controller-6d4c8f55f9-cb27s\" (UID: \"1ee928ff-08a3-4e74-9dd3-04d00096d823\") " pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:15.097804 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.097711 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-68cddf8959-rpkqj"] Apr 22 18:55:15.098014 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:55:15.097995 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-6w74f], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-68cddf8959-rpkqj" podUID="5f54371d-e00f-40af-9e1e-af1108b6bc1e" Apr 22 18:55:15.146372 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.146346 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:15.149405 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.149379 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6w74f\" (UniqueName: \"kubernetes.io/projected/5f54371d-e00f-40af-9e1e-af1108b6bc1e-kube-api-access-6w74f\") pod \"maas-controller-68cddf8959-rpkqj\" (UID: \"5f54371d-e00f-40af-9e1e-af1108b6bc1e\") " pod="opendatahub/maas-controller-68cddf8959-rpkqj" Apr 22 18:55:15.157486 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.157468 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w74f\" (UniqueName: \"kubernetes.io/projected/5f54371d-e00f-40af-9e1e-af1108b6bc1e-kube-api-access-6w74f\") pod \"maas-controller-68cddf8959-rpkqj\" (UID: \"5f54371d-e00f-40af-9e1e-af1108b6bc1e\") " pod="opendatahub/maas-controller-68cddf8959-rpkqj" Apr 22 18:55:15.271475 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:55:15.271443 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ee928ff_08a3_4e74_9dd3_04d00096d823.slice/crio-741d947f37b3b58affbf568f14dbcaf19f5b6e5f81eb9868ce8beaed7f17b655 WatchSource:0}: Error finding container 741d947f37b3b58affbf568f14dbcaf19f5b6e5f81eb9868ce8beaed7f17b655: Status 404 returned error can't find the container with id 741d947f37b3b58affbf568f14dbcaf19f5b6e5f81eb9868ce8beaed7f17b655 Apr 22 18:55:15.271607 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.271466 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cb27s"] Apr 22 18:55:15.307784 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:15.307753 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5aa9124-c07d-4f8b-966f-cbc36c500e90" path="/var/lib/kubelet/pods/b5aa9124-c07d-4f8b-966f-cbc36c500e90/volumes" Apr 22 18:55:16.009837 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:16.009793 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" event={"ID":"1ee928ff-08a3-4e74-9dd3-04d00096d823","Type":"ContainerStarted","Data":"741d947f37b3b58affbf568f14dbcaf19f5b6e5f81eb9868ce8beaed7f17b655"} Apr 22 18:55:16.009837 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:16.009825 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68cddf8959-rpkqj" Apr 22 18:55:16.016584 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:16.016552 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68cddf8959-rpkqj" Apr 22 18:55:16.058831 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:16.058694 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w74f\" (UniqueName: \"kubernetes.io/projected/5f54371d-e00f-40af-9e1e-af1108b6bc1e-kube-api-access-6w74f\") pod \"5f54371d-e00f-40af-9e1e-af1108b6bc1e\" (UID: \"5f54371d-e00f-40af-9e1e-af1108b6bc1e\") " Apr 22 18:55:16.062120 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:16.062076 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f54371d-e00f-40af-9e1e-af1108b6bc1e-kube-api-access-6w74f" (OuterVolumeSpecName: "kube-api-access-6w74f") pod "5f54371d-e00f-40af-9e1e-af1108b6bc1e" (UID: "5f54371d-e00f-40af-9e1e-af1108b6bc1e"). InnerVolumeSpecName "kube-api-access-6w74f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:16.160146 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:16.160098 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6w74f\" (UniqueName: \"kubernetes.io/projected/5f54371d-e00f-40af-9e1e-af1108b6bc1e-kube-api-access-6w74f\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:55:17.017441 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:17.017316 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-68cddf8959-rpkqj" Apr 22 18:55:17.051452 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:17.051420 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-68cddf8959-rpkqj"] Apr 22 18:55:17.056622 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:17.056595 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-68cddf8959-rpkqj"] Apr 22 18:55:17.308704 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:17.308674 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f54371d-e00f-40af-9e1e-af1108b6bc1e" path="/var/lib/kubelet/pods/5f54371d-e00f-40af-9e1e-af1108b6bc1e/volumes" Apr 22 18:55:18.023114 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:18.023076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" event={"ID":"1ee928ff-08a3-4e74-9dd3-04d00096d823","Type":"ContainerStarted","Data":"c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e"} Apr 22 18:55:18.023312 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:18.023203 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:18.039891 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:18.039845 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" podStartSLOduration=1.820149301 podStartE2EDuration="4.039832506s" podCreationTimestamp="2026-04-22 18:55:14 +0000 UTC" firstStartedPulling="2026-04-22 18:55:15.272772191 +0000 UTC m=+778.560208349" lastFinishedPulling="2026-04-22 18:55:17.492455398 +0000 UTC m=+780.779891554" observedRunningTime="2026-04-22 18:55:18.038344507 +0000 UTC m=+781.325780685" watchObservedRunningTime="2026-04-22 18:55:18.039832506 +0000 UTC m=+781.327268684" Apr 22 18:55:19.713838 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.713806 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-9d8d9b57-97jlb"] Apr 22 18:55:19.719038 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.719018 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:19.721594 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.721573 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 22 18:55:19.721676 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.721652 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 22 18:55:19.721807 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.721793 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-4fsbs\"" Apr 22 18:55:19.727649 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.727628 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-9d8d9b57-97jlb"] Apr 22 18:55:19.799404 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.799364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a666449-9e35-400b-9a99-baf40bbbc4c7-maas-api-tls\") pod \"maas-api-9d8d9b57-97jlb\" (UID: \"7a666449-9e35-400b-9a99-baf40bbbc4c7\") " pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:19.799602 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.799468 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5xh\" (UniqueName: \"kubernetes.io/projected/7a666449-9e35-400b-9a99-baf40bbbc4c7-kube-api-access-sd5xh\") pod \"maas-api-9d8d9b57-97jlb\" (UID: \"7a666449-9e35-400b-9a99-baf40bbbc4c7\") " pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:19.900434 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.900401 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a666449-9e35-400b-9a99-baf40bbbc4c7-maas-api-tls\") pod \"maas-api-9d8d9b57-97jlb\" (UID: \"7a666449-9e35-400b-9a99-baf40bbbc4c7\") " pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:19.900629 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.900530 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5xh\" (UniqueName: \"kubernetes.io/projected/7a666449-9e35-400b-9a99-baf40bbbc4c7-kube-api-access-sd5xh\") pod \"maas-api-9d8d9b57-97jlb\" (UID: \"7a666449-9e35-400b-9a99-baf40bbbc4c7\") " pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:19.902825 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.902795 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a666449-9e35-400b-9a99-baf40bbbc4c7-maas-api-tls\") pod \"maas-api-9d8d9b57-97jlb\" (UID: \"7a666449-9e35-400b-9a99-baf40bbbc4c7\") " pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:19.911243 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:19.911216 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5xh\" (UniqueName: \"kubernetes.io/projected/7a666449-9e35-400b-9a99-baf40bbbc4c7-kube-api-access-sd5xh\") pod \"maas-api-9d8d9b57-97jlb\" (UID: \"7a666449-9e35-400b-9a99-baf40bbbc4c7\") " pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:20.031278 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:20.031186 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:20.175803 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:20.175776 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-9d8d9b57-97jlb"] Apr 22 18:55:20.177213 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:55:20.177183 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a666449_9e35_400b_9a99_baf40bbbc4c7.slice/crio-aef4ca375b3b254aa819e3bb94460e9a87e431d2afd0957cf45d6b4f57733344 WatchSource:0}: Error finding container aef4ca375b3b254aa819e3bb94460e9a87e431d2afd0957cf45d6b4f57733344: Status 404 returned error can't find the container with id aef4ca375b3b254aa819e3bb94460e9a87e431d2afd0957cf45d6b4f57733344 Apr 22 18:55:21.039101 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:21.039059 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-9d8d9b57-97jlb" event={"ID":"7a666449-9e35-400b-9a99-baf40bbbc4c7","Type":"ContainerStarted","Data":"aef4ca375b3b254aa819e3bb94460e9a87e431d2afd0957cf45d6b4f57733344"} Apr 22 18:55:22.044876 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:22.044842 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-9d8d9b57-97jlb" event={"ID":"7a666449-9e35-400b-9a99-baf40bbbc4c7","Type":"ContainerStarted","Data":"da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d"} Apr 22 18:55:22.045260 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:22.044940 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:22.063514 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:22.063455 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-9d8d9b57-97jlb" podStartSLOduration=1.7309733349999998 podStartE2EDuration="3.063442875s" podCreationTimestamp="2026-04-22 18:55:19 +0000 UTC" firstStartedPulling="2026-04-22 18:55:20.178788036 +0000 UTC m=+783.466224191" lastFinishedPulling="2026-04-22 18:55:21.511257562 +0000 UTC m=+784.798693731" observedRunningTime="2026-04-22 18:55:22.062778757 +0000 UTC m=+785.350214982" watchObservedRunningTime="2026-04-22 18:55:22.063442875 +0000 UTC m=+785.350879052" Apr 22 18:55:28.055353 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:28.055326 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:55:29.035106 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.035073 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:29.543754 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.543718 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cb27s"] Apr 22 18:55:29.544210 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.543985 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" podUID="1ee928ff-08a3-4e74-9dd3-04d00096d823" containerName="manager" containerID="cri-o://c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e" gracePeriod=10 Apr 22 18:55:29.777784 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.777761 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:29.840769 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.840737 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-8497966894-4gbwd"] Apr 22 18:55:29.841171 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.841156 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1ee928ff-08a3-4e74-9dd3-04d00096d823" containerName="manager" Apr 22 18:55:29.841228 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.841173 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee928ff-08a3-4e74-9dd3-04d00096d823" containerName="manager" Apr 22 18:55:29.841264 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.841249 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="1ee928ff-08a3-4e74-9dd3-04d00096d823" containerName="manager" Apr 22 18:55:29.843565 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.843549 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:55:29.852013 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.851986 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8497966894-4gbwd"] Apr 22 18:55:29.898065 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.898032 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc4t8\" (UniqueName: \"kubernetes.io/projected/1ee928ff-08a3-4e74-9dd3-04d00096d823-kube-api-access-gc4t8\") pod \"1ee928ff-08a3-4e74-9dd3-04d00096d823\" (UID: \"1ee928ff-08a3-4e74-9dd3-04d00096d823\") " Apr 22 18:55:29.900202 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.900172 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee928ff-08a3-4e74-9dd3-04d00096d823-kube-api-access-gc4t8" (OuterVolumeSpecName: "kube-api-access-gc4t8") pod "1ee928ff-08a3-4e74-9dd3-04d00096d823" (UID: "1ee928ff-08a3-4e74-9dd3-04d00096d823"). InnerVolumeSpecName "kube-api-access-gc4t8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:29.999252 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.999215 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9qq\" (UniqueName: \"kubernetes.io/projected/0b9cecd7-a118-4650-a223-c0686eb09641-kube-api-access-pc9qq\") pod \"maas-controller-8497966894-4gbwd\" (UID: \"0b9cecd7-a118-4650-a223-c0686eb09641\") " pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:55:29.999420 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:29.999384 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gc4t8\" (UniqueName: \"kubernetes.io/projected/1ee928ff-08a3-4e74-9dd3-04d00096d823-kube-api-access-gc4t8\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:55:30.090306 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.090270 2568 generic.go:358] "Generic (PLEG): container finished" podID="1ee928ff-08a3-4e74-9dd3-04d00096d823" containerID="c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e" exitCode=0 Apr 22 18:55:30.090461 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.090345 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" Apr 22 18:55:30.090461 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.090349 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" event={"ID":"1ee928ff-08a3-4e74-9dd3-04d00096d823","Type":"ContainerDied","Data":"c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e"} Apr 22 18:55:30.090461 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.090386 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-cb27s" event={"ID":"1ee928ff-08a3-4e74-9dd3-04d00096d823","Type":"ContainerDied","Data":"741d947f37b3b58affbf568f14dbcaf19f5b6e5f81eb9868ce8beaed7f17b655"} Apr 22 18:55:30.090461 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.090404 2568 scope.go:117] "RemoveContainer" containerID="c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e" Apr 22 18:55:30.101696 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.101660 2568 scope.go:117] "RemoveContainer" containerID="c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e" Apr 22 18:55:30.101994 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.101974 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9qq\" (UniqueName: \"kubernetes.io/projected/0b9cecd7-a118-4650-a223-c0686eb09641-kube-api-access-pc9qq\") pod \"maas-controller-8497966894-4gbwd\" (UID: \"0b9cecd7-a118-4650-a223-c0686eb09641\") " pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:55:30.102108 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:55:30.102090 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e\": container with ID starting with c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e not found: ID does not exist" containerID="c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e" Apr 22 18:55:30.102168 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.102116 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e"} err="failed to get container status \"c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e\": rpc error: code = NotFound desc = could not find container \"c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e\": container with ID starting with c48478f061ccb5ff6000266377c1e26f300911c6bb2c9594e5203ded8f0d621e not found: ID does not exist" Apr 22 18:55:30.110127 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.110105 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9qq\" (UniqueName: \"kubernetes.io/projected/0b9cecd7-a118-4650-a223-c0686eb09641-kube-api-access-pc9qq\") pod \"maas-controller-8497966894-4gbwd\" (UID: \"0b9cecd7-a118-4650-a223-c0686eb09641\") " pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:55:30.113395 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.113368 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cb27s"] Apr 22 18:55:30.117659 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.117636 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-cb27s"] Apr 22 18:55:30.155462 ip-10-0-137-223 kubenswrapper[2568]: I0422 
18:55:30.155434 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:55:30.283463 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:30.283440 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8497966894-4gbwd"] Apr 22 18:55:30.285230 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:55:30.285196 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b9cecd7_a118_4650_a223_c0686eb09641.slice/crio-77f098c64c73c97d46d81c3616247624d63ebb143c9de4952fe4f6e49c077b5d WatchSource:0}: Error finding container 77f098c64c73c97d46d81c3616247624d63ebb143c9de4952fe4f6e49c077b5d: Status 404 returned error can't find the container with id 77f098c64c73c97d46d81c3616247624d63ebb143c9de4952fe4f6e49c077b5d Apr 22 18:55:31.097576 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:31.097540 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8497966894-4gbwd" event={"ID":"0b9cecd7-a118-4650-a223-c0686eb09641","Type":"ContainerStarted","Data":"77f098c64c73c97d46d81c3616247624d63ebb143c9de4952fe4f6e49c077b5d"} Apr 22 18:55:31.307919 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:31.307883 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee928ff-08a3-4e74-9dd3-04d00096d823" path="/var/lib/kubelet/pods/1ee928ff-08a3-4e74-9dd3-04d00096d823/volumes" Apr 22 18:55:32.102582 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:32.102541 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8497966894-4gbwd" event={"ID":"0b9cecd7-a118-4650-a223-c0686eb09641","Type":"ContainerStarted","Data":"22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c"} Apr 22 18:55:32.103057 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:32.102649 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:55:32.119980 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:32.119930 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-8497966894-4gbwd" podStartSLOduration=1.6825826529999999 podStartE2EDuration="3.119915964s" podCreationTimestamp="2026-04-22 18:55:29 +0000 UTC" firstStartedPulling="2026-04-22 18:55:30.286610887 +0000 UTC m=+793.574047048" lastFinishedPulling="2026-04-22 18:55:31.723944203 +0000 UTC m=+795.011380359" observedRunningTime="2026-04-22 18:55:32.118805814 +0000 UTC m=+795.406241992" watchObservedRunningTime="2026-04-22 18:55:32.119915964 +0000 UTC m=+795.407352141" Apr 22 18:55:43.112683 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:55:43.112608 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:56:00.748352 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:00.748313 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-9d8d9b57-97jlb"] Apr 22 18:56:00.748756 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:00.748580 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-9d8d9b57-97jlb" podUID="7a666449-9e35-400b-9a99-baf40bbbc4c7" containerName="maas-api" containerID="cri-o://da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d" gracePeriod=30 Apr 22 18:56:00.998070 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:00.998046 2568 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:56:01.090737 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.090702 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd5xh\" (UniqueName: \"kubernetes.io/projected/7a666449-9e35-400b-9a99-baf40bbbc4c7-kube-api-access-sd5xh\") pod \"7a666449-9e35-400b-9a99-baf40bbbc4c7\" (UID: \"7a666449-9e35-400b-9a99-baf40bbbc4c7\") " Apr 22 18:56:01.090950 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.090795 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a666449-9e35-400b-9a99-baf40bbbc4c7-maas-api-tls\") pod \"7a666449-9e35-400b-9a99-baf40bbbc4c7\" (UID: \"7a666449-9e35-400b-9a99-baf40bbbc4c7\") " Apr 22 18:56:01.093086 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.093054 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a666449-9e35-400b-9a99-baf40bbbc4c7-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "7a666449-9e35-400b-9a99-baf40bbbc4c7" (UID: "7a666449-9e35-400b-9a99-baf40bbbc4c7"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:56:01.093213 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.093058 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a666449-9e35-400b-9a99-baf40bbbc4c7-kube-api-access-sd5xh" (OuterVolumeSpecName: "kube-api-access-sd5xh") pod "7a666449-9e35-400b-9a99-baf40bbbc4c7" (UID: "7a666449-9e35-400b-9a99-baf40bbbc4c7"). InnerVolumeSpecName "kube-api-access-sd5xh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:56:01.191835 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.191793 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sd5xh\" (UniqueName: \"kubernetes.io/projected/7a666449-9e35-400b-9a99-baf40bbbc4c7-kube-api-access-sd5xh\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:56:01.191835 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.191821 2568 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7a666449-9e35-400b-9a99-baf40bbbc4c7-maas-api-tls\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:56:01.215272 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.215233 2568 generic.go:358] "Generic (PLEG): container finished" podID="7a666449-9e35-400b-9a99-baf40bbbc4c7" containerID="da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d" exitCode=0 Apr 22 18:56:01.215471 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.215303 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-9d8d9b57-97jlb" Apr 22 18:56:01.215471 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.215310 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-9d8d9b57-97jlb" event={"ID":"7a666449-9e35-400b-9a99-baf40bbbc4c7","Type":"ContainerDied","Data":"da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d"} Apr 22 18:56:01.215471 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.215350 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-9d8d9b57-97jlb" event={"ID":"7a666449-9e35-400b-9a99-baf40bbbc4c7","Type":"ContainerDied","Data":"aef4ca375b3b254aa819e3bb94460e9a87e431d2afd0957cf45d6b4f57733344"} Apr 22 18:56:01.215471 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.215367 2568 scope.go:117] "RemoveContainer" containerID="da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d" Apr 22 18:56:01.228702 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.228678 2568 scope.go:117] "RemoveContainer" containerID="da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d" Apr 22 18:56:01.229031 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:56:01.228998 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d\": container with ID starting with da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d not found: ID does not exist" containerID="da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d" Apr 22 18:56:01.229137 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.229046 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d"} err="failed to get container status \"da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d\": rpc error: code = NotFound desc = could not find container \"da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d\": container with ID starting with da939acfcbab55c0be5c435fad4776c08fa49c47ebc86d2f0c4b8c8c1958688d not found: ID does not exist" Apr 22 18:56:01.245574 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.245543 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-9d8d9b57-97jlb"] Apr 22 18:56:01.247991 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.247971 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-9d8d9b57-97jlb"] Apr 22 18:56:01.307359 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:01.307321 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a666449-9e35-400b-9a99-baf40bbbc4c7" path="/var/lib/kubelet/pods/7a666449-9e35-400b-9a99-baf40bbbc4c7/volumes" Apr 22 18:56:13.215339 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.215304 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7"] Apr 22 18:56:13.215774 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.215710 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7a666449-9e35-400b-9a99-baf40bbbc4c7" containerName="maas-api" Apr 22 18:56:13.215774 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.215722 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a666449-9e35-400b-9a99-baf40bbbc4c7" containerName="maas-api" Apr 22 18:56:13.215862 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.215796 
2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="7a666449-9e35-400b-9a99-baf40bbbc4c7" containerName="maas-api" Apr 22 18:56:13.218050 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.218032 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.221944 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.221911 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 22 18:56:13.222821 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.222802 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 22 18:56:13.222918 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.222829 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 22 18:56:13.222980 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.222928 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-r6h6w\"" Apr 22 18:56:13.228817 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.228798 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7"] Apr 22 18:56:13.302965 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.302937 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.303152 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.303030 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/243dc079-6420-4b04-962f-61f2a006a05c-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.303152 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.303114 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbkbf\" (UniqueName: \"kubernetes.io/projected/243dc079-6420-4b04-962f-61f2a006a05c-kube-api-access-cbkbf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.303281 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.303164 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.303281 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.303219 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: 
\"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.303281 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.303265 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.404588 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.404554 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.404753 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.404594 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.404818 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.404747 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.404875 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.404819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.405018 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.404993 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.405100 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.405059 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.405266 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.405248 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/243dc079-6420-4b04-962f-61f2a006a05c-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") 
" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.405355 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.405320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbkbf\" (UniqueName: \"kubernetes.io/projected/243dc079-6420-4b04-962f-61f2a006a05c-kube-api-access-cbkbf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.405416 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.405350 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.406985 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.406965 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/243dc079-6420-4b04-962f-61f2a006a05c-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.407589 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.407572 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/243dc079-6420-4b04-962f-61f2a006a05c-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.413433 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.413408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbkbf\" (UniqueName: \"kubernetes.io/projected/243dc079-6420-4b04-962f-61f2a006a05c-kube-api-access-cbkbf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7\" (UID: \"243dc079-6420-4b04-962f-61f2a006a05c\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.529343 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.529263 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:13.684639 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.684611 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7"] Apr 22 18:56:13.686636 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:56:13.686600 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod243dc079_6420_4b04_962f_61f2a006a05c.slice/crio-3ebe5246461b59fccbf8aec4e557b7601eb5dd8e22055cb5a029aad744f5a149 WatchSource:0}: Error finding container 3ebe5246461b59fccbf8aec4e557b7601eb5dd8e22055cb5a029aad744f5a149: Status 404 returned error can't find the container with id 3ebe5246461b59fccbf8aec4e557b7601eb5dd8e22055cb5a029aad744f5a149 Apr 22 18:56:13.688355 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:13.688336 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:56:14.273657 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:14.273620 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" event={"ID":"243dc079-6420-4b04-962f-61f2a006a05c","Type":"ContainerStarted","Data":"3ebe5246461b59fccbf8aec4e557b7601eb5dd8e22055cb5a029aad744f5a149"} Apr 22 18:56:20.305295 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:20.305243 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" event={"ID":"243dc079-6420-4b04-962f-61f2a006a05c","Type":"ContainerStarted","Data":"61078dac311d787d6dc1af986945cf9c25a64662200ed2053731b12cb3ad694e"} Apr 22 18:56:25.324267 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:25.324231 2568 generic.go:358] "Generic (PLEG): container finished" podID="243dc079-6420-4b04-962f-61f2a006a05c" containerID="61078dac311d787d6dc1af986945cf9c25a64662200ed2053731b12cb3ad694e" exitCode=0 Apr 22 18:56:25.324748 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:25.324330 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" event={"ID":"243dc079-6420-4b04-962f-61f2a006a05c","Type":"ContainerDied","Data":"61078dac311d787d6dc1af986945cf9c25a64662200ed2053731b12cb3ad694e"} Apr 22 18:56:27.334748 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:27.334716 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" event={"ID":"243dc079-6420-4b04-962f-61f2a006a05c","Type":"ContainerStarted","Data":"7175fa0190f95c1ca08db2c0e6b2777d90bae555acaa85024e5b0d0e289a5842"} Apr 22 18:56:27.335145 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:27.334914 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" Apr 22 18:56:27.354800 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:27.354752 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7" podStartSLOduration=1.661500843 podStartE2EDuration="14.354739904s" podCreationTimestamp="2026-04-22 18:56:13 +0000 UTC" firstStartedPulling="2026-04-22 18:56:13.688515064 +0000 UTC m=+836.975951249" lastFinishedPulling="2026-04-22 18:56:26.381754145 +0000 UTC m=+849.669190310" observedRunningTime="2026-04-22 18:56:27.352618302 +0000 UTC m=+850.640054478" watchObservedRunningTime="2026-04-22 18:56:27.354739904 +0000 UTC 
Apr 22 18:56:38.352347 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:56:38.352309 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7"
Apr 22 18:57:44.246210 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.246172 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-67c8dc5c7-flxgg"]
Apr 22 18:57:44.249404 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.249386 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-67c8dc5c7-flxgg"
Apr 22 18:57:44.256164 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.256134 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-67c8dc5c7-flxgg"]
Apr 22 18:57:44.363085 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.363035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c6211f98-24c1-42a2-b4fa-4dd56f164cbe-tls-cert\") pod \"authorino-67c8dc5c7-flxgg\" (UID: \"c6211f98-24c1-42a2-b4fa-4dd56f164cbe\") " pod="kuadrant-system/authorino-67c8dc5c7-flxgg"
Apr 22 18:57:44.363085 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.363091 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gbdp\" (UniqueName: \"kubernetes.io/projected/c6211f98-24c1-42a2-b4fa-4dd56f164cbe-kube-api-access-8gbdp\") pod \"authorino-67c8dc5c7-flxgg\" (UID: \"c6211f98-24c1-42a2-b4fa-4dd56f164cbe\") " pod="kuadrant-system/authorino-67c8dc5c7-flxgg"
Apr 22 18:57:44.464487 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.464439 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c6211f98-24c1-42a2-b4fa-4dd56f164cbe-tls-cert\") pod \"authorino-67c8dc5c7-flxgg\" (UID: \"c6211f98-24c1-42a2-b4fa-4dd56f164cbe\") " pod="kuadrant-system/authorino-67c8dc5c7-flxgg"
Apr 22 18:57:44.464726 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.464606 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8gbdp\" (UniqueName: \"kubernetes.io/projected/c6211f98-24c1-42a2-b4fa-4dd56f164cbe-kube-api-access-8gbdp\") pod \"authorino-67c8dc5c7-flxgg\" (UID: \"c6211f98-24c1-42a2-b4fa-4dd56f164cbe\") " pod="kuadrant-system/authorino-67c8dc5c7-flxgg"
Apr 22 18:57:44.467013 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.466987 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c6211f98-24c1-42a2-b4fa-4dd56f164cbe-tls-cert\") pod \"authorino-67c8dc5c7-flxgg\" (UID: \"c6211f98-24c1-42a2-b4fa-4dd56f164cbe\") " pod="kuadrant-system/authorino-67c8dc5c7-flxgg"
Apr 22 18:57:44.472713 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.472681 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gbdp\" (UniqueName: \"kubernetes.io/projected/c6211f98-24c1-42a2-b4fa-4dd56f164cbe-kube-api-access-8gbdp\") pod \"authorino-67c8dc5c7-flxgg\" (UID: \"c6211f98-24c1-42a2-b4fa-4dd56f164cbe\") " pod="kuadrant-system/authorino-67c8dc5c7-flxgg"
Apr 22 18:57:44.559883 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.559852 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-67c8dc5c7-flxgg"
Apr 22 18:57:44.686356 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:44.686327 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-67c8dc5c7-flxgg"]
Apr 22 18:57:44.689036 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:57:44.689006 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6211f98_24c1_42a2_b4fa_4dd56f164cbe.slice/crio-9dd8fc1b8289f75700ff6b65126adfe00cda863bcac1e834b2cf377ab87d1914 WatchSource:0}: Error finding container 9dd8fc1b8289f75700ff6b65126adfe00cda863bcac1e834b2cf377ab87d1914: Status 404 returned error can't find the container with id 9dd8fc1b8289f75700ff6b65126adfe00cda863bcac1e834b2cf377ab87d1914
Apr 22 18:57:45.644983 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:45.644944 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-67c8dc5c7-flxgg" event={"ID":"c6211f98-24c1-42a2-b4fa-4dd56f164cbe","Type":"ContainerStarted","Data":"b8a5ee9ba5260fb19fcbfc4a4cce30a10b8f6b805838dda64c04cefb63fe1cfb"}
Apr 22 18:57:45.644983 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:45.644986 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-67c8dc5c7-flxgg" event={"ID":"c6211f98-24c1-42a2-b4fa-4dd56f164cbe","Type":"ContainerStarted","Data":"9dd8fc1b8289f75700ff6b65126adfe00cda863bcac1e834b2cf377ab87d1914"}
Apr 22 18:57:45.663443 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:45.663382 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-67c8dc5c7-flxgg" podStartSLOduration=1.151166809 podStartE2EDuration="1.663363166s" podCreationTimestamp="2026-04-22 18:57:44 +0000 UTC" firstStartedPulling="2026-04-22 18:57:44.690257166 +0000 UTC m=+927.977693322" lastFinishedPulling="2026-04-22 18:57:45.202453524 +0000 UTC m=+928.489889679" observedRunningTime="2026-04-22 18:57:45.661260707 +0000 UTC m=+928.948696885" watchObservedRunningTime="2026-04-22 18:57:45.663363166 +0000 UTC m=+928.950799345"
Apr 22 18:57:45.693257 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:45.693208 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5f5b6f4f8d-hslqn"]
Apr 22 18:57:45.693785 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:45.693729 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" podUID="d799c2a4-52c1-4688-93f7-69327f4f9388" containerName="authorino" containerID="cri-o://e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308" gracePeriod=30
Apr 22 18:57:45.946648 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:45.946628 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn"
Apr 22 18:57:46.080811 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.080777 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d799c2a4-52c1-4688-93f7-69327f4f9388-tls-cert\") pod \"d799c2a4-52c1-4688-93f7-69327f4f9388\" (UID: \"d799c2a4-52c1-4688-93f7-69327f4f9388\") "
Apr 22 18:57:46.080984 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.080831 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6545q\" (UniqueName: \"kubernetes.io/projected/d799c2a4-52c1-4688-93f7-69327f4f9388-kube-api-access-6545q\") pod \"d799c2a4-52c1-4688-93f7-69327f4f9388\" (UID: \"d799c2a4-52c1-4688-93f7-69327f4f9388\") "
Apr 22 18:57:46.083014 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.082978 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d799c2a4-52c1-4688-93f7-69327f4f9388-kube-api-access-6545q" (OuterVolumeSpecName: "kube-api-access-6545q") pod "d799c2a4-52c1-4688-93f7-69327f4f9388" (UID: "d799c2a4-52c1-4688-93f7-69327f4f9388"). InnerVolumeSpecName "kube-api-access-6545q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:57:46.091869 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.091841 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d799c2a4-52c1-4688-93f7-69327f4f9388-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "d799c2a4-52c1-4688-93f7-69327f4f9388" (UID: "d799c2a4-52c1-4688-93f7-69327f4f9388"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:57:46.182279 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.182196 2568 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d799c2a4-52c1-4688-93f7-69327f4f9388-tls-cert\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\""
Apr 22 18:57:46.182279 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.182224 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6545q\" (UniqueName: \"kubernetes.io/projected/d799c2a4-52c1-4688-93f7-69327f4f9388-kube-api-access-6545q\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\""
Apr 22 18:57:46.649807 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.649772 2568 generic.go:358] "Generic (PLEG): container finished" podID="d799c2a4-52c1-4688-93f7-69327f4f9388" containerID="e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308" exitCode=0
Apr 22 18:57:46.650264 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.649832 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn"
Apr 22 18:57:46.650264 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.649858 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" event={"ID":"d799c2a4-52c1-4688-93f7-69327f4f9388","Type":"ContainerDied","Data":"e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308"}
Apr 22 18:57:46.650264 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.649897 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f5b6f4f8d-hslqn" event={"ID":"d799c2a4-52c1-4688-93f7-69327f4f9388","Type":"ContainerDied","Data":"64a324a73dbce6dea782cde83c2ed8c7ac557554dd60a5456bd917874e65d5fd"}
Apr 22 18:57:46.650264 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.649912 2568 scope.go:117] "RemoveContainer" containerID="e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308"
Apr 22 18:57:46.659382 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.659362 2568 scope.go:117] "RemoveContainer" containerID="e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308"
Apr 22 18:57:46.659641 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:57:46.659625 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308\": container with ID starting with e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308 not found: ID does not exist" containerID="e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308"
Apr 22 18:57:46.659704 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.659647 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308"} err="failed to get container status \"e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308\": rpc error: code = NotFound desc = could not find container \"e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308\": container with ID starting with e5aa6899574b43f0962aac242e99efddf729ab7b53acbd4c07b293e0edd46308 not found: ID does not exist"
Apr 22 18:57:46.671763 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.671732 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5f5b6f4f8d-hslqn"]
Apr 22 18:57:46.674840 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:46.674816 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5f5b6f4f8d-hslqn"]
Apr 22 18:57:47.307588 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:57:47.307543 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d799c2a4-52c1-4688-93f7-69327f4f9388" path="/var/lib/kubelet/pods/d799c2a4-52c1-4688-93f7-69327f4f9388/volumes"
Apr 22 18:59:09.441796 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:09.441700 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8497966894-4gbwd"]
Apr 22 18:59:09.442298 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:09.441977 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-8497966894-4gbwd" podUID="0b9cecd7-a118-4650-a223-c0686eb09641" containerName="manager" containerID="cri-o://22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c" gracePeriod=10
Apr 22 18:59:09.689204 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:09.689183 2568 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:59:09.862156 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:09.862110 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc9qq\" (UniqueName: \"kubernetes.io/projected/0b9cecd7-a118-4650-a223-c0686eb09641-kube-api-access-pc9qq\") pod \"0b9cecd7-a118-4650-a223-c0686eb09641\" (UID: \"0b9cecd7-a118-4650-a223-c0686eb09641\") " Apr 22 18:59:09.864281 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:09.864247 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9cecd7-a118-4650-a223-c0686eb09641-kube-api-access-pc9qq" (OuterVolumeSpecName: "kube-api-access-pc9qq") pod "0b9cecd7-a118-4650-a223-c0686eb09641" (UID: "0b9cecd7-a118-4650-a223-c0686eb09641"). InnerVolumeSpecName "kube-api-access-pc9qq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:59:09.963351 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:09.963317 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pc9qq\" (UniqueName: \"kubernetes.io/projected/0b9cecd7-a118-4650-a223-c0686eb09641-kube-api-access-pc9qq\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 18:59:10.015023 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.014984 2568 generic.go:358] "Generic (PLEG): container finished" podID="0b9cecd7-a118-4650-a223-c0686eb09641" containerID="22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c" exitCode=0 Apr 22 18:59:10.015198 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.015058 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-8497966894-4gbwd" Apr 22 18:59:10.015198 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.015074 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8497966894-4gbwd" event={"ID":"0b9cecd7-a118-4650-a223-c0686eb09641","Type":"ContainerDied","Data":"22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c"} Apr 22 18:59:10.015198 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.015114 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8497966894-4gbwd" event={"ID":"0b9cecd7-a118-4650-a223-c0686eb09641","Type":"ContainerDied","Data":"77f098c64c73c97d46d81c3616247624d63ebb143c9de4952fe4f6e49c077b5d"} Apr 22 18:59:10.015198 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.015130 2568 scope.go:117] "RemoveContainer" containerID="22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c" Apr 22 18:59:10.024266 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.024247 2568 scope.go:117] "RemoveContainer" containerID="22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c" Apr 22 18:59:10.024636 ip-10-0-137-223 kubenswrapper[2568]: E0422 18:59:10.024616 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c\": container with ID starting with 22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c not found: ID does not exist" containerID="22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c" Apr 22 18:59:10.024700 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.024646 2568 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c"} err="failed to get container status \"22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c\": rpc error: code = NotFound desc = could not find container \"22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c\": container with ID starting with 22c440ce32aa8531099611b4ce20dfd052f5602a6e61db357f63ad01e9696d0c not found: ID does not exist" Apr 22 18:59:10.036845 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.036817 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-8497966894-4gbwd"] Apr 22 18:59:10.040435 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.040407 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-8497966894-4gbwd"] Apr 22 18:59:10.775596 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.775564 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-8497966894-cq8nr"] Apr 22 18:59:10.775966 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.775945 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b9cecd7-a118-4650-a223-c0686eb09641" containerName="manager" Apr 22 18:59:10.775966 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.775956 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9cecd7-a118-4650-a223-c0686eb09641" containerName="manager" Apr 22 18:59:10.776041 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.775979 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d799c2a4-52c1-4688-93f7-69327f4f9388" containerName="authorino" Apr 22 18:59:10.776041 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.775985 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="d799c2a4-52c1-4688-93f7-69327f4f9388" containerName="authorino" Apr 22 18:59:10.776104 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.776047 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="d799c2a4-52c1-4688-93f7-69327f4f9388" containerName="authorino" Apr 22 18:59:10.776104 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.776060 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b9cecd7-a118-4650-a223-c0686eb09641" containerName="manager" Apr 22 18:59:10.780459 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.780441 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8497966894-cq8nr" Apr 22 18:59:10.783042 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.783018 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-jh8rj\"" Apr 22 18:59:10.785603 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.785578 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8497966894-cq8nr"] Apr 22 18:59:10.873971 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.873934 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5dp\" (UniqueName: \"kubernetes.io/projected/fc38e4c4-1055-425c-aa6b-ac1f28e5b563-kube-api-access-hr5dp\") pod \"maas-controller-8497966894-cq8nr\" (UID: \"fc38e4c4-1055-425c-aa6b-ac1f28e5b563\") " pod="opendatahub/maas-controller-8497966894-cq8nr" Apr 22 18:59:10.974978 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.974942 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5dp\" (UniqueName: \"kubernetes.io/projected/fc38e4c4-1055-425c-aa6b-ac1f28e5b563-kube-api-access-hr5dp\") pod \"maas-controller-8497966894-cq8nr\" (UID: \"fc38e4c4-1055-425c-aa6b-ac1f28e5b563\") " pod="opendatahub/maas-controller-8497966894-cq8nr" Apr 22 18:59:10.982709 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:10.982682 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5dp\" (UniqueName: \"kubernetes.io/projected/fc38e4c4-1055-425c-aa6b-ac1f28e5b563-kube-api-access-hr5dp\") pod \"maas-controller-8497966894-cq8nr\" (UID: \"fc38e4c4-1055-425c-aa6b-ac1f28e5b563\") " pod="opendatahub/maas-controller-8497966894-cq8nr" Apr 22 18:59:11.092631 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:11.092605 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-8497966894-cq8nr" Apr 22 18:59:11.307021 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:11.306983 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9cecd7-a118-4650-a223-c0686eb09641" path="/var/lib/kubelet/pods/0b9cecd7-a118-4650-a223-c0686eb09641/volumes" Apr 22 18:59:11.424358 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:11.424332 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-8497966894-cq8nr"] Apr 22 18:59:11.425376 ip-10-0-137-223 kubenswrapper[2568]: W0422 18:59:11.425350 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc38e4c4_1055_425c_aa6b_ac1f28e5b563.slice/crio-9694d749cb02f487a4ff36076c80bb317d10ed1090dd22136a22413bee64268e WatchSource:0}: Error finding container 9694d749cb02f487a4ff36076c80bb317d10ed1090dd22136a22413bee64268e: Status 404 returned error can't find the container with id 9694d749cb02f487a4ff36076c80bb317d10ed1090dd22136a22413bee64268e Apr 22 18:59:12.025168 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:12.025138 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8497966894-cq8nr" event={"ID":"fc38e4c4-1055-425c-aa6b-ac1f28e5b563","Type":"ContainerStarted","Data":"531fe355db6c90451a898e44d6f2c125ba59693c4a4dfe7a8220971ba21b32f0"} Apr 22 18:59:12.025448 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:12.025178 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-8497966894-cq8nr" event={"ID":"fc38e4c4-1055-425c-aa6b-ac1f28e5b563","Type":"ContainerStarted","Data":"9694d749cb02f487a4ff36076c80bb317d10ed1090dd22136a22413bee64268e"} Apr 22 18:59:13.035817 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:13.031847 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-8497966894-cq8nr" Apr 22 18:59:13.051030 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:13.050971 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-8497966894-cq8nr" podStartSLOduration=2.516540797 podStartE2EDuration="3.050955966s" podCreationTimestamp="2026-04-22 18:59:10 +0000 UTC" firstStartedPulling="2026-04-22 18:59:11.42667476 +0000 UTC m=+1014.714110916" lastFinishedPulling="2026-04-22 18:59:11.961089929 +0000 UTC m=+1015.248526085" observedRunningTime="2026-04-22 18:59:13.047816056 +0000 UTC m=+1016.335252245" watchObservedRunningTime="2026-04-22 18:59:13.050955966 +0000 UTC m=+1016.338392144" Apr 22 18:59:25.043858 ip-10-0-137-223 kubenswrapper[2568]: I0422 18:59:25.043825 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-8497966894-cq8nr" Apr 22 19:00:00.140600 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.140563 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29614740-g6mdv"] Apr 22 19:00:00.144553 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.144530 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" Apr 22 19:00:00.147070 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.147048 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-4fsbs\"" Apr 22 19:00:00.151345 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.151322 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614740-g6mdv"] Apr 22 19:00:00.227698 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.227665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs8sb\" (UniqueName: \"kubernetes.io/projected/b9e76956-0b88-44fe-8b0a-1a32ae3b5538-kube-api-access-fs8sb\") pod \"maas-api-key-cleanup-29614740-g6mdv\" (UID: \"b9e76956-0b88-44fe-8b0a-1a32ae3b5538\") " pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" Apr 22 19:00:00.328860 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.328805 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fs8sb\" (UniqueName: \"kubernetes.io/projected/b9e76956-0b88-44fe-8b0a-1a32ae3b5538-kube-api-access-fs8sb\") pod \"maas-api-key-cleanup-29614740-g6mdv\" (UID: \"b9e76956-0b88-44fe-8b0a-1a32ae3b5538\") " pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" Apr 22 19:00:00.337419 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.337382 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs8sb\" (UniqueName: \"kubernetes.io/projected/b9e76956-0b88-44fe-8b0a-1a32ae3b5538-kube-api-access-fs8sb\") pod \"maas-api-key-cleanup-29614740-g6mdv\" (UID: \"b9e76956-0b88-44fe-8b0a-1a32ae3b5538\") " pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" Apr 22 19:00:00.456572 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.456472 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" Apr 22 19:00:00.578115 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:00.577958 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614740-g6mdv"] Apr 22 19:00:00.580879 ip-10-0-137-223 kubenswrapper[2568]: W0422 19:00:00.580851 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9e76956_0b88_44fe_8b0a_1a32ae3b5538.slice/crio-2a10febab9961f5cad7cec8b5ea379a76da9f04f7ee1ce8d8fd146fa7396f15c WatchSource:0}: Error finding container 2a10febab9961f5cad7cec8b5ea379a76da9f04f7ee1ce8d8fd146fa7396f15c: Status 404 returned error can't find the container with id 2a10febab9961f5cad7cec8b5ea379a76da9f04f7ee1ce8d8fd146fa7396f15c Apr 22 19:00:01.217237 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:01.217205 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" event={"ID":"b9e76956-0b88-44fe-8b0a-1a32ae3b5538","Type":"ContainerStarted","Data":"2a10febab9961f5cad7cec8b5ea379a76da9f04f7ee1ce8d8fd146fa7396f15c"} Apr 22 19:00:02.222200 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:02.222165 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" event={"ID":"b9e76956-0b88-44fe-8b0a-1a32ae3b5538","Type":"ContainerStarted","Data":"fbc6a35b12d8dd62d948c8f1b733eeb370336f3a2f0bb5a1df80456ed4ab31a1"} Apr 22 19:00:02.239459 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:02.239407 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" podStartSLOduration=0.926325382 podStartE2EDuration="2.239392962s" podCreationTimestamp="2026-04-22 19:00:00 +0000 UTC" firstStartedPulling="2026-04-22 19:00:00.582631597 +0000 UTC m=+1063.870067753" lastFinishedPulling="2026-04-22 19:00:01.895699174 +0000 UTC m=+1065.183135333" observedRunningTime="2026-04-22 19:00:02.236942655 +0000 UTC m=+1065.524378846" watchObservedRunningTime="2026-04-22 19:00:02.239392962 +0000 UTC m=+1065.526829139" Apr 22 19:00:23.307368 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:23.307334 2568 generic.go:358] "Generic (PLEG): container finished" podID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" containerID="fbc6a35b12d8dd62d948c8f1b733eeb370336f3a2f0bb5a1df80456ed4ab31a1" exitCode=6 Apr 22 19:00:23.307753 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:23.307373 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" event={"ID":"b9e76956-0b88-44fe-8b0a-1a32ae3b5538","Type":"ContainerDied","Data":"fbc6a35b12d8dd62d948c8f1b733eeb370336f3a2f0bb5a1df80456ed4ab31a1"} Apr 22 19:00:23.307812 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:23.307756 2568 scope.go:117] "RemoveContainer" containerID="fbc6a35b12d8dd62d948c8f1b733eeb370336f3a2f0bb5a1df80456ed4ab31a1" Apr 22 19:00:24.313427 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:24.313391 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" event={"ID":"b9e76956-0b88-44fe-8b0a-1a32ae3b5538","Type":"ContainerStarted","Data":"2e8871838fb9fac2e1d02b1635c5a57273c431848d3350ada95cece776b90fbe"} Apr 22 19:00:44.397046 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:44.397010 2568 generic.go:358] "Generic (PLEG): container finished" podID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" 
containerID="2e8871838fb9fac2e1d02b1635c5a57273c431848d3350ada95cece776b90fbe" exitCode=6 Apr 22 19:00:44.397563 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:44.397080 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" event={"ID":"b9e76956-0b88-44fe-8b0a-1a32ae3b5538","Type":"ContainerDied","Data":"2e8871838fb9fac2e1d02b1635c5a57273c431848d3350ada95cece776b90fbe"} Apr 22 19:00:44.397563 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:44.397128 2568 scope.go:117] "RemoveContainer" containerID="fbc6a35b12d8dd62d948c8f1b733eeb370336f3a2f0bb5a1df80456ed4ab31a1" Apr 22 19:00:44.397563 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:00:44.397432 2568 scope.go:117] "RemoveContainer" containerID="2e8871838fb9fac2e1d02b1635c5a57273c431848d3350ada95cece776b90fbe" Apr 22 19:00:44.397731 ip-10-0-137-223 kubenswrapper[2568]: E0422 19:00:44.397713 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29614740-g6mdv_opendatahub(b9e76956-0b88-44fe-8b0a-1a32ae3b5538)\"" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" podUID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" Apr 22 19:01:00.015527 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.012445 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614740-g6mdv"] Apr 22 19:01:00.145597 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.145571 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" Apr 22 19:01:00.270911 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.270835 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs8sb\" (UniqueName: \"kubernetes.io/projected/b9e76956-0b88-44fe-8b0a-1a32ae3b5538-kube-api-access-fs8sb\") pod \"b9e76956-0b88-44fe-8b0a-1a32ae3b5538\" (UID: \"b9e76956-0b88-44fe-8b0a-1a32ae3b5538\") " Apr 22 19:01:00.273167 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.273131 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e76956-0b88-44fe-8b0a-1a32ae3b5538-kube-api-access-fs8sb" (OuterVolumeSpecName: "kube-api-access-fs8sb") pod "b9e76956-0b88-44fe-8b0a-1a32ae3b5538" (UID: "b9e76956-0b88-44fe-8b0a-1a32ae3b5538"). InnerVolumeSpecName "kube-api-access-fs8sb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:01:00.371647 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.371609 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fs8sb\" (UniqueName: \"kubernetes.io/projected/b9e76956-0b88-44fe-8b0a-1a32ae3b5538-kube-api-access-fs8sb\") on node \"ip-10-0-137-223.ec2.internal\" DevicePath \"\"" Apr 22 19:01:00.464822 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.464793 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" Apr 22 19:01:00.464996 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.464793 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29614740-g6mdv" event={"ID":"b9e76956-0b88-44fe-8b0a-1a32ae3b5538","Type":"ContainerDied","Data":"2a10febab9961f5cad7cec8b5ea379a76da9f04f7ee1ce8d8fd146fa7396f15c"} Apr 22 19:01:00.464996 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.464913 2568 scope.go:117] "RemoveContainer" containerID="2e8871838fb9fac2e1d02b1635c5a57273c431848d3350ada95cece776b90fbe" Apr 22 19:01:00.486953 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.486920 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614740-g6mdv"] Apr 22 19:01:00.489755 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:00.489732 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29614740-g6mdv"] Apr 22 19:01:01.307287 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:01:01.307253 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" path="/var/lib/kubelet/pods/b9e76956-0b88-44fe-8b0a-1a32ae3b5538/volumes" Apr 22 19:10:28.672554 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.672517 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x"] Apr 22 19:10:28.673055 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.672921 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" containerName="cleanup" Apr 22 19:10:28.673055 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.672934 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" containerName="cleanup" Apr 22 19:10:28.673055 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.672946 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" containerName="cleanup" Apr 22 19:10:28.673055 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.672952 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" containerName="cleanup" Apr 22 19:10:28.673055 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.673032 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" containerName="cleanup" Apr 22 19:10:28.673055 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.673044 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="b9e76956-0b88-44fe-8b0a-1a32ae3b5538" containerName="cleanup" Apr 22 19:10:28.675880 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.675863 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:28.678421 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.678403 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-brdcv\"" Apr 22 19:10:28.688260 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.688238 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x"] Apr 22 19:10:28.771774 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.771729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsk8l\" (UniqueName: \"kubernetes.io/projected/f513e8c2-bf9f-4488-9ab9-2e215b85ccf6-kube-api-access-fsk8l\") pod \"kuadrant-operator-controller-manager-55c7f4c975-qh58x\" (UID: \"f513e8c2-bf9f-4488-9ab9-2e215b85ccf6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:28.771976 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.771871 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f513e8c2-bf9f-4488-9ab9-2e215b85ccf6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-qh58x\" (UID: \"f513e8c2-bf9f-4488-9ab9-2e215b85ccf6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:28.872466 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.872423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f513e8c2-bf9f-4488-9ab9-2e215b85ccf6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-qh58x\" (UID: \"f513e8c2-bf9f-4488-9ab9-2e215b85ccf6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:28.872664 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.872479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsk8l\" (UniqueName: \"kubernetes.io/projected/f513e8c2-bf9f-4488-9ab9-2e215b85ccf6-kube-api-access-fsk8l\") pod \"kuadrant-operator-controller-manager-55c7f4c975-qh58x\" (UID: \"f513e8c2-bf9f-4488-9ab9-2e215b85ccf6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:28.872908 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.872885 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f513e8c2-bf9f-4488-9ab9-2e215b85ccf6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-qh58x\" (UID: \"f513e8c2-bf9f-4488-9ab9-2e215b85ccf6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:28.881427 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.881403 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsk8l\" (UniqueName: \"kubernetes.io/projected/f513e8c2-bf9f-4488-9ab9-2e215b85ccf6-kube-api-access-fsk8l\") pod \"kuadrant-operator-controller-manager-55c7f4c975-qh58x\" (UID: \"f513e8c2-bf9f-4488-9ab9-2e215b85ccf6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:28.987451 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:28.987365 2568 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:29.128531 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:29.126803 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x"] Apr 22 19:10:29.134970 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:29.134947 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:10:29.748985 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:29.748947 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" event={"ID":"f513e8c2-bf9f-4488-9ab9-2e215b85ccf6","Type":"ContainerStarted","Data":"101348bdffa6a650c6ee1e280dec5abda785f54d557e3f76ccf15ce3ce2f750c"} Apr 22 19:10:29.748985 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:29.748989 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" event={"ID":"f513e8c2-bf9f-4488-9ab9-2e215b85ccf6","Type":"ContainerStarted","Data":"32c1f4b16cff968408afd48e2b7b03dc25743a9d04eacc1f8431c4f73601e8a7"} Apr 22 19:10:29.749531 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:29.749021 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:10:29.779303 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:29.779252 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" podStartSLOduration=1.779235229 podStartE2EDuration="1.779235229s" podCreationTimestamp="2026-04-22 19:10:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:10:29.777093664 +0000 UTC m=+1693.064529845" watchObservedRunningTime="2026-04-22 19:10:29.779235229 +0000 UTC m=+1693.066671408" Apr 22 19:10:40.755853 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:10:40.755775 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-qh58x" Apr 22 19:20:10.649409 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:10.649369 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-67c8dc5c7-flxgg_c6211f98-24c1-42a2-b4fa-4dd56f164cbe/authorino/0.log" Apr 22 19:20:15.193483 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:15.193447 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-8497966894-cq8nr_fc38e4c4-1055-425c-aa6b-ac1f28e5b563/manager/0.log" Apr 22 19:20:15.567067 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:15.567034 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-dd89cc56c-6szw2_4f72cd4e-5f6c-4405-a2ba-2700077a5303/manager/0.log" Apr 22 19:20:15.805402 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:15.805366 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-qfcrx_fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd/postgres/0.log" Apr 22 19:20:16.539368 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.539339 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7_9286965c-ebc2-4e26-bac3-c03239bdf69d/util/0.log" Apr 22 19:20:16.546605 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.546581 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7_9286965c-ebc2-4e26-bac3-c03239bdf69d/pull/0.log" Apr 22 19:20:16.552709 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.552685 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7_9286965c-ebc2-4e26-bac3-c03239bdf69d/extract/0.log" Apr 22 19:20:16.672307 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.672266 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt_7e694063-aa04-400c-b495-0599ed097240/util/0.log" Apr 22 19:20:16.684982 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.684959 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt_7e694063-aa04-400c-b495-0599ed097240/pull/0.log" Apr 22 19:20:16.696413 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.696383 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt_7e694063-aa04-400c-b495-0599ed097240/extract/0.log" Apr 22 19:20:16.802928 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.802900 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh_2c232082-9fba-4b55-9a9d-03825ab46808/util/0.log" Apr 22 19:20:16.809643 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.809620 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh_2c232082-9fba-4b55-9a9d-03825ab46808/pull/0.log" Apr 22 19:20:16.815461 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.815432 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh_2c232082-9fba-4b55-9a9d-03825ab46808/extract/0.log" Apr 22 19:20:16.921762 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.921732 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8_a0f7e806-cb23-46c1-8e64-e9a40b758857/util/0.log" Apr 22 19:20:16.930965 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.930938 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8_a0f7e806-cb23-46c1-8e64-e9a40b758857/pull/0.log" Apr 22 19:20:16.938036 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:16.938015 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8_a0f7e806-cb23-46c1-8e64-e9a40b758857/extract/0.log" Apr 22 19:20:17.063232 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:17.063154 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-67c8dc5c7-flxgg_c6211f98-24c1-42a2-b4fa-4dd56f164cbe/authorino/0.log" Apr 22 19:20:17.210931 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:17.210905 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-qct57_b752037c-f7bc-4322-b5aa-7ff269e24c25/manager/0.log" Apr 22 19:20:17.453280 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:17.453196 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-vhnlx_36212567-0437-4616-8bf3-8a7c2aecc713/kuadrant-console-plugin/0.log" Apr 22 19:20:17.711467 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:17.711387 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-qh58x_f513e8c2-bf9f-4488-9ab9-2e215b85ccf6/manager/0.log" Apr 22 19:20:18.302204 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:18.302170 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f6zc79_d6592537-fd96-4b35-a36c-623383f391e5/istio-proxy/0.log" Apr 22 19:20:18.756320 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:18.756236 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-rhh77_cbb27c2e-e776-4b45-9d72-a8e02e0da32a/istio-proxy/0.log" Apr 22 19:20:19.194167 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:19.194137 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7_243dc079-6420-4b04-962f-61f2a006a05c/storage-initializer/0.log" Apr 22 19:20:19.202688 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:19.202654 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-mwff7_243dc079-6420-4b04-962f-61f2a006a05c/main/0.log" Apr 22 19:20:26.439698 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:26.439663 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8krsn_97b448ce-3744-4ce3-8f4c-027793be2c53/global-pull-secret-syncer/0.log" Apr 22 19:20:26.548418 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:26.548381 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-2sskf_b31604e3-953f-4f25-9eb6-b50d500412bb/konnectivity-agent/0.log" Apr 22 19:20:26.713950 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:26.713864 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-223.ec2.internal_10feef451d98ade143f78a7b02301421/haproxy/0.log" Apr 22 19:20:30.619856 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.619824 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7_9286965c-ebc2-4e26-bac3-c03239bdf69d/extract/0.log" Apr 22 19:20:30.647408 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.647378 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7_9286965c-ebc2-4e26-bac3-c03239bdf69d/util/0.log" Apr 22 19:20:30.685420 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.685392 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759l5cf7_9286965c-ebc2-4e26-bac3-c03239bdf69d/pull/0.log" Apr 22 19:20:30.729932 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.729904 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt_7e694063-aa04-400c-b495-0599ed097240/extract/0.log" Apr 22 19:20:30.763730 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.763703 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt_7e694063-aa04-400c-b495-0599ed097240/util/0.log" Apr 22 19:20:30.819577 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.819552 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e052gbt_7e694063-aa04-400c-b495-0599ed097240/pull/0.log" Apr 22 19:20:30.883955 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.883872 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh_2c232082-9fba-4b55-9a9d-03825ab46808/extract/0.log" Apr 22 19:20:30.912410 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.912384 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh_2c232082-9fba-4b55-9a9d-03825ab46808/util/0.log" Apr 22 19:20:30.949390 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:30.949358 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73bpwsh_2c232082-9fba-4b55-9a9d-03825ab46808/pull/0.log" Apr 22 19:20:31.004464 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:31.004434 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8_a0f7e806-cb23-46c1-8e64-e9a40b758857/extract/0.log" Apr 22 19:20:31.039392 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:31.039365 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8_a0f7e806-cb23-46c1-8e64-e9a40b758857/util/0.log" Apr 22 19:20:31.075650 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:31.075611 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1pnws8_a0f7e806-cb23-46c1-8e64-e9a40b758857/pull/0.log" Apr 22 19:20:31.338313 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:31.338262 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-67c8dc5c7-flxgg_c6211f98-24c1-42a2-b4fa-4dd56f164cbe/authorino/0.log" Apr 22 19:20:31.398946 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:31.398917 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-qct57_b752037c-f7bc-4322-b5aa-7ff269e24c25/manager/0.log" Apr 22 19:20:31.492328 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:31.492297 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-vhnlx_36212567-0437-4616-8bf3-8a7c2aecc713/kuadrant-console-plugin/0.log" Apr 22 19:20:31.641350 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:31.641270 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-qh58x_f513e8c2-bf9f-4488-9ab9-2e215b85ccf6/manager/0.log" Apr 22 19:20:33.045512 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.045469 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c4b20925-76d9-43ec-9e89-1ad0ad8b85ad/alertmanager/0.log" Apr 22 19:20:33.067691 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.067656 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c4b20925-76d9-43ec-9e89-1ad0ad8b85ad/config-reloader/0.log" Apr 22 19:20:33.098668 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.098639 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c4b20925-76d9-43ec-9e89-1ad0ad8b85ad/kube-rbac-proxy-web/0.log" Apr 22 19:20:33.124639 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.124611 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c4b20925-76d9-43ec-9e89-1ad0ad8b85ad/kube-rbac-proxy/0.log" Apr 22 19:20:33.171519 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.171478 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c4b20925-76d9-43ec-9e89-1ad0ad8b85ad/kube-rbac-proxy-metric/0.log" Apr 22 19:20:33.207358 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.207327 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c4b20925-76d9-43ec-9e89-1ad0ad8b85ad/prom-label-proxy/0.log" Apr 22 19:20:33.239167 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.239135 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_c4b20925-76d9-43ec-9e89-1ad0ad8b85ad/init-config-reloader/0.log" Apr 22 19:20:33.448540 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.448514 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-95b554d55-6w22l_32fb26b5-bd68-4510-9211-df5d18a08f2a/metrics-server/0.log" Apr 22 19:20:33.480511 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.480480 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-fnqsx_8f6b62ea-4987-4f68-9fbb-e65c98816700/monitoring-plugin/0.log" Apr 22 19:20:33.516370 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.516331 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-495hl_7f7befd2-b691-4c66-8a72-803d0d1ef203/node-exporter/0.log" Apr 22 19:20:33.543754 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.543720 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-495hl_7f7befd2-b691-4c66-8a72-803d0d1ef203/kube-rbac-proxy/0.log" Apr 22 19:20:33.572527 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.572487 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-495hl_7f7befd2-b691-4c66-8a72-803d0d1ef203/init-textfile/0.log" Apr 22 19:20:33.777478 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.777406 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fkg94_4259dbde-47d1-4b05-a472-b19c8b4af292/kube-rbac-proxy-main/0.log" Apr 22 19:20:33.798237 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.798209 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fkg94_4259dbde-47d1-4b05-a472-b19c8b4af292/kube-rbac-proxy-self/0.log" Apr 22 19:20:33.821718 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:33.821692 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-fkg94_4259dbde-47d1-4b05-a472-b19c8b4af292/openshift-state-metrics/0.log" Apr 22 19:20:34.104289 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.104255 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bb9d8c5b9-s7rr2_5153a39e-9b48-415d-8a21-4879a686d80b/telemeter-client/0.log" Apr 22 19:20:34.125429 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.125400 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bb9d8c5b9-s7rr2_5153a39e-9b48-415d-8a21-4879a686d80b/reload/0.log" Apr 22 19:20:34.146982 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.146949 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7bb9d8c5b9-s7rr2_5153a39e-9b48-415d-8a21-4879a686d80b/kube-rbac-proxy/0.log" Apr 22 19:20:34.179095 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.179066 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd7484947-44nwn_e36d6c01-6e8b-4001-8601-07ca12c15e68/thanos-query/0.log" Apr 22 19:20:34.207103 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.207068 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd7484947-44nwn_e36d6c01-6e8b-4001-8601-07ca12c15e68/kube-rbac-proxy-web/0.log" Apr 22 19:20:34.228557 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.228526 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd7484947-44nwn_e36d6c01-6e8b-4001-8601-07ca12c15e68/kube-rbac-proxy/0.log" Apr 22 19:20:34.249355 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.249325 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd7484947-44nwn_e36d6c01-6e8b-4001-8601-07ca12c15e68/prom-label-proxy/0.log" Apr 22 19:20:34.271021 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.270990 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd7484947-44nwn_e36d6c01-6e8b-4001-8601-07ca12c15e68/kube-rbac-proxy-rules/0.log" Apr 22 19:20:34.292425 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.292401 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7bd7484947-44nwn_e36d6c01-6e8b-4001-8601-07ca12c15e68/kube-rbac-proxy-metrics/0.log" Apr 22 19:20:34.593901 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.593864 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz"] Apr 22 19:20:34.597697 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.597674 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.600281 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.600256 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tz8b7\"/\"kube-root-ca.crt\"" Apr 22 19:20:34.600417 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.600259 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tz8b7\"/\"openshift-service-ca.crt\"" Apr 22 19:20:34.601042 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.601024 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tz8b7\"/\"default-dockercfg-7p5tx\"" Apr 22 19:20:34.609245 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.609222 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz"] Apr 22 19:20:34.735943 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.735908 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-sys\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.736126 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.735959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-lib-modules\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.736126 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.736026 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkdl6\" (UniqueName: \"kubernetes.io/projected/d5335808-9488-4785-92e0-76c84e13848a-kube-api-access-wkdl6\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.736126 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.736062 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-proc\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.736126 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.736122 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-podres\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837558 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837496 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-proc\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " 
pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837738 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837597 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-podres\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837738 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-sys\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837738 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837627 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-proc\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837738 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837690 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-sys\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837738 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837692 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-lib-modules\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837905 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837751 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-podres\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837905 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837770 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5335808-9488-4785-92e0-76c84e13848a-lib-modules\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.837905 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.837752 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkdl6\" (UniqueName: \"kubernetes.io/projected/d5335808-9488-4785-92e0-76c84e13848a-kube-api-access-wkdl6\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.847043 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.846994 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wkdl6\" (UniqueName: \"kubernetes.io/projected/d5335808-9488-4785-92e0-76c84e13848a-kube-api-access-wkdl6\") pod \"perf-node-gather-daemonset-jm6mz\" (UID: \"d5335808-9488-4785-92e0-76c84e13848a\") " pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:34.909828 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:34.909786 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:35.040350 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:35.040324 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz"] Apr 22 19:20:35.041556 ip-10-0-137-223 kubenswrapper[2568]: W0422 19:20:35.041522 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd5335808_9488_4785_92e0_76c84e13848a.slice/crio-562cd75bcb3d1ab3634b144d79b5fd7a26b7e8855f03fbcd665e72960a76d121 WatchSource:0}: Error finding container 562cd75bcb3d1ab3634b144d79b5fd7a26b7e8855f03fbcd665e72960a76d121: Status 404 returned error can't find the container with id 562cd75bcb3d1ab3634b144d79b5fd7a26b7e8855f03fbcd665e72960a76d121 Apr 22 19:20:35.043280 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:35.043263 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:20:35.237167 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:35.237127 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" event={"ID":"d5335808-9488-4785-92e0-76c84e13848a","Type":"ContainerStarted","Data":"5e006d20e2c4e03735cdfde3388f92ca29c1d5ee130a9f73d819be4b0b20d3cc"} Apr 22 19:20:35.237167 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:35.237166 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" event={"ID":"d5335808-9488-4785-92e0-76c84e13848a","Type":"ContainerStarted","Data":"562cd75bcb3d1ab3634b144d79b5fd7a26b7e8855f03fbcd665e72960a76d121"} Apr 22 19:20:35.237611 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:35.237182 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:35.253707 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:35.253655 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" podStartSLOduration=1.253641357 podStartE2EDuration="1.253641357s" podCreationTimestamp="2026-04-22 19:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:20:35.251682926 +0000 UTC m=+2298.539119105" watchObservedRunningTime="2026-04-22 19:20:35.253641357 +0000 UTC m=+2298.541077534" Apr 22 19:20:36.328104 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:36.328070 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6c749d644c-mnmb8_657cc86f-ffcb-47c5-b5ad-39f5262418c9/console/0.log" Apr 22 19:20:36.362559 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:36.362486 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-zchbz_d47db21d-84ef-412a-8e2e-dd9b3fc6599b/download-server/0.log" Apr 22 19:20:37.687164 ip-10-0-137-223 kubenswrapper[2568]: I0422 
19:20:37.687133 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j7zw4_d0da4783-9f5c-40c3-80f0-155df59f22de/dns/0.log" Apr 22 19:20:37.709742 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:37.709718 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-j7zw4_d0da4783-9f5c-40c3-80f0-155df59f22de/kube-rbac-proxy/0.log" Apr 22 19:20:37.790015 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:37.789981 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hn296_7b2ce3e8-c29b-4c51-bc68-022864d2d2fa/dns-node-resolver/0.log" Apr 22 19:20:38.274356 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:38.274322 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-84d66bbd7f-zv9dl_5e39c6e8-66e9-40ad-af3a-752c97681e94/registry/0.log" Apr 22 19:20:38.359393 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:38.359364 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pnwpz_fa08d78c-48ce-42a5-85a9-3f4ae1d8a468/node-ca/0.log" Apr 22 19:20:39.156829 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:39.156796 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f6zc79_d6592537-fd96-4b35-a36c-623383f391e5/istio-proxy/0.log" Apr 22 19:20:39.412405 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:39.412323 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-rhh77_cbb27c2e-e776-4b45-9d72-a8e02e0da32a/istio-proxy/0.log" Apr 22 19:20:39.954993 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:39.954958 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9c2rk_b09bc722-d952-4713-bbad-524034fa2063/serve-healthcheck-canary/0.log" Apr 22 19:20:40.505283 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:40.505245 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zw28s_b8003c92-2bc0-4825-974f-12470b332830/insights-operator/0.log" Apr 22 19:20:40.506905 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:40.506885 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-zw28s_b8003c92-2bc0-4825-974f-12470b332830/insights-operator/1.log" Apr 22 19:20:40.596414 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:40.596387 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hhfz5_dba4c78e-e2a1-46fe-af9a-af4d512b4e4a/kube-rbac-proxy/0.log" Apr 22 19:20:40.616797 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:40.616767 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hhfz5_dba4c78e-e2a1-46fe-af9a-af4d512b4e4a/exporter/0.log" Apr 22 19:20:40.638303 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:40.638274 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hhfz5_dba4c78e-e2a1-46fe-af9a-af4d512b4e4a/extractor/0.log" Apr 22 19:20:41.251332 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:41.251305 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tz8b7/perf-node-gather-daemonset-jm6mz" Apr 22 19:20:42.681169 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:42.681138 2568 
log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-8497966894-cq8nr_fc38e4c4-1055-425c-aa6b-ac1f28e5b563/manager/0.log" Apr 22 19:20:42.782254 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:42.782223 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-dd89cc56c-6szw2_4f72cd4e-5f6c-4405-a2ba-2700077a5303/manager/0.log" Apr 22 19:20:42.831524 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:42.831482 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-qfcrx_fa4fc7d7-4ce3-456a-adbb-6086dac4b4fd/postgres/0.log" Apr 22 19:20:44.207163 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:44.207054 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-54dc496758-6kd5m_0ddafe4a-a370-476f-92f1-e4a10f75f714/manager/0.log" Apr 22 19:20:50.151949 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.151919 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bvmpf_7acfec53-d2af-4a00-a80f-87b1b1b045b1/kube-multus-additional-cni-plugins/0.log" Apr 22 19:20:50.201793 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.201755 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bvmpf_7acfec53-d2af-4a00-a80f-87b1b1b045b1/egress-router-binary-copy/0.log" Apr 22 19:20:50.240270 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.240232 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bvmpf_7acfec53-d2af-4a00-a80f-87b1b1b045b1/cni-plugins/0.log" Apr 22 19:20:50.273559 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.273534 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bvmpf_7acfec53-d2af-4a00-a80f-87b1b1b045b1/bond-cni-plugin/0.log" Apr 22 19:20:50.323286 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.323256 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bvmpf_7acfec53-d2af-4a00-a80f-87b1b1b045b1/routeoverride-cni/0.log" Apr 22 19:20:50.347435 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.347400 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bvmpf_7acfec53-d2af-4a00-a80f-87b1b1b045b1/whereabouts-cni-bincopy/0.log" Apr 22 19:20:50.370178 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.370152 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bvmpf_7acfec53-d2af-4a00-a80f-87b1b1b045b1/whereabouts-cni/0.log" Apr 22 19:20:50.586940 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.586911 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-w85bl_72d04d6b-6ce2-4128-a4ed-8adc4a19b7bc/kube-multus/0.log" Apr 22 19:20:50.762345 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.762309 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p7h9z_42b4b135-c4b0-4460-84ed-684f25a4436d/network-metrics-daemon/0.log" Apr 22 19:20:50.797565 ip-10-0-137-223 kubenswrapper[2568]: I0422 19:20:50.797535 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p7h9z_42b4b135-c4b0-4460-84ed-684f25a4436d/kube-rbac-proxy/0.log"