Apr 16 13:11:37.195738 ip-10-0-137-242 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:11:37.629234 ip-10-0-137-242 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:37.629234 ip-10-0-137-242 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:11:37.629234 ip-10-0-137-242 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:37.629234 ip-10-0-137-242 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:11:37.629234 ip-10-0-137-242 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:37.631946 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.631856 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:11:37.638925 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638899 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:37.638925 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638923 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:37.638925 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638929 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:37.638925 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638933 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638938 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638943 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638948 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638952 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638956 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638961 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638967 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638971 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638976 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638985 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638989 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638994 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.638999 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639004 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639008 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639012 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639016 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639021 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639026 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:37.639164 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639030 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639035 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639043 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639048 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639053 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639058 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639062 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639070 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639075 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639079 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639084 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639088 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639092 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639100 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639111 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639116 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639121 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639126 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639131 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639135 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:37.639977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639141 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639145 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639150 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639154 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639159 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639163 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639168 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639177 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639181 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639186 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639191 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639196 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639200 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639204 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639209 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639214 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639219 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639223 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639228 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:37.640787 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639237 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639242 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639250 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639254 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639259 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639263 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639267 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639273 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639277 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639282 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639286 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639291 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639300 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639305 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639310 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639315 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639320 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639324 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639328 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639333 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:37.641431 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639337 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639342 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639347 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.639351 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.640404 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.640415 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.640419 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.640424 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.640429 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.640438 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.640443 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641023 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641030 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641034 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641037 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641041 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641045 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641048 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641050 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:37.641966 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641053 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641056 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641059 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641062 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641065 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641067 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641070 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641073 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641076 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641079 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641082 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641084 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641087 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641090 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641092 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641095 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641097 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641100 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641103 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641107 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:37.642466 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641109 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641113 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641115 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641118 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641122 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641125 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641128 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641130 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641133 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641136 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641138 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641143 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641146 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641149 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641152 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641155 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641157 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641160 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641164 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:37.642970 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641168 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641171 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641174 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641176 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641181 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641188 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641191 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641193 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641198 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641201 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641204 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641208 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641211 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641217 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641223 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641226 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641230 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641234 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641239 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641243 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:37.643439 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641247 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641251 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641257 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641260 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641263 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641266 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641270 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641273 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641275 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641278 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641281 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.641283 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642073 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642082 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642089 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642093 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642099 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642102 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642107 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642111 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642115 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:11:37.643977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642118 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642122 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642126 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642129 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642132 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642136 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642139 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642142 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642145 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642148 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642153 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642156 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642159 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642162 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642165 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642176 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642179 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642182 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642186 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642189 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642192 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642195 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642198 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642201 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642205 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:11:37.644503 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642208 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642211 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642214 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642217 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642220 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642225 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642228 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642231 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642235 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642238 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642242 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642245 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642248 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642251 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642255 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642258 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642261 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:11:37.645120 ip-10-0-137-242
kubenswrapper[2574]: I0416 13:11:37.642264 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642267 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642270 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642273 2574 flags.go:64] FLAG: --feature-gates="" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642277 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642280 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642283 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642286 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642289 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:11:37.645120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642293 2574 flags.go:64] FLAG: --help="false" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642295 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-137-242.ec2.internal" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642299 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642302 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642305 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 
13:11:37.642308 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642311 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642314 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642317 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642320 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642323 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642327 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642330 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642333 2574 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642336 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642339 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642342 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642345 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642348 2574 flags.go:64] FLAG: --lock-file="" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642351 
2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642354 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642357 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642362 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:11:37.645811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642365 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642368 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642371 2574 flags.go:64] FLAG: --logging-format="text" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642374 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642377 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642380 2574 flags.go:64] FLAG: --manifest-url="" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642383 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642387 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642390 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642394 2574 flags.go:64] FLAG: --max-pods="110" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642397 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:11:37.646363 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642400 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642404 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642407 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642410 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642413 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642415 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642423 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642426 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642429 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642433 2574 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642436 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642441 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642444 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:11:37.646363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642448 2574 flags.go:64] 
FLAG: --pods-per-core="0" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642463 2574 flags.go:64] FLAG: --port="10250" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642467 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642471 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09d66ac90a3363a91" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642474 2574 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642477 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642481 2574 flags.go:64] FLAG: --register-node="true" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642484 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642487 2574 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642490 2574 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642493 2574 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642496 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642499 2574 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642503 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642507 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642510 2574 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642513 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642515 2574 flags.go:64] FLAG: --runonce="false" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642518 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642522 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642525 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642527 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642530 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642533 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642536 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642539 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:11:37.647001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642542 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642546 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642549 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642552 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 
13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642556 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642559 2574 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642562 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642569 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642571 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642574 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642579 2574 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642583 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642585 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642588 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642591 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642594 2574 flags.go:64] FLAG: --v="2" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642599 2574 flags.go:64] FLAG: --version="false" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642603 2574 flags.go:64] FLAG: --vmodule="" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642607 2574 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.642610 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642705 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642710 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642713 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642716 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:11:37.647648 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642719 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642722 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642724 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642727 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642730 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642733 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642736 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 
13:11:37.642738 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642741 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642744 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642746 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642749 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642752 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642755 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642758 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642760 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642763 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642765 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642768 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642770 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 
13:11:37.648228 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642773 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642775 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642778 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642780 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642783 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642786 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642789 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642792 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642794 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642797 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642800 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642802 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642805 2574 feature_gate.go:328] unrecognized feature gate: 
NutanixMultiSubnets Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642807 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642810 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642812 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642818 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642820 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642825 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:11:37.648786 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642831 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642834 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642837 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642840 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642842 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642845 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642848 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642851 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642854 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642857 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642859 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642862 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642865 2574 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642867 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642870 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642872 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642875 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642877 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642880 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642883 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:11:37.649284 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642886 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642888 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642891 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642894 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642897 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 
13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642899 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642902 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642904 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642909 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642914 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642917 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642919 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642923 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642926 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642929 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642932 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642934 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 
13:11:37.642937 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642940 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:37.649791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642943 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:37.650257 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642945 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:37.650257 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642948 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:37.650257 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.642951 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:37.650257 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.643666 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:37.651085 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.651064 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:11:37.651117 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.651087 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:11:37.651144 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651134 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:37.651144 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651139 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:37.651144 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651143 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651146 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651150 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651154 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651156 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651159 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651162 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651165 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651169 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651173 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651177 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651180 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651183 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651186 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651189 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651192 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651195 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651198 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651201 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:37.651224 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651203 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651206 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651209 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651211 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651214 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651216 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651219 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651222 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651225 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651228 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651232 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651235 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651238 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651240 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651243 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651246 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651248 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651251 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651254 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651257 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:37.651706 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651259 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651262 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651264 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651267 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651269 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651272 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651275 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651277 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651280 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651283 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651285 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651287 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651290 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651293 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651297 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651299 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651302 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651305 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651309 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651312 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:37.652192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651315 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651318 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651320 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651324 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651327 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651330 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651332 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651334 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651337 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651340 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651342 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651345 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651347 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651350 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651353 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651355 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651358 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651360 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651363 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651366 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:37.652708 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651369 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651372 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651374 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651377 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651380 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.651385 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651494 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651499 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651502 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651505 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651508 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651510 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651513 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651515 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651518 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651521 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:37.653236 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651524 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651527 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651529 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651531 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651534 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651537 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651539 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651542 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651544 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651547 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651551 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651555 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651558 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651561 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651563 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651566 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651568 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651571 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651573 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:37.653674 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651576 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651578 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651581 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651583 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651586 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651589 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651591 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651594 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651597 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651599 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651602 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651604 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651607 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651609 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651612 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651615 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651617 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651620 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651623 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651625 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:37.654115 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651628 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651630 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651633 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651635 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651638 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651640 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651643 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651645 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651649 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651652 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651655 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651657 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651660 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651663 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651666 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651668 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651671 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651674 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651676 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651679 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:37.654622 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651681 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651684 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651687 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651689 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651692 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651694 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651697 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651700 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651703 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651705 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651708 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651711 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651714 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651716 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651719 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651722 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:37.655102 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:37.651725 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:37.655506 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.651730 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:37.655506 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.652631 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:11:37.655567 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.655537 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:11:37.656517 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.656505 2574 server.go:1019] "Starting client certificate rotation"
Apr 16 13:11:37.656619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.656603 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:11:37.656650 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.656644 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:11:37.680203 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.680185 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:11:37.682564 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.682542 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:11:37.699063 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.699044 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:11:37.705073 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.705050 2574 log.go:25] "Validated CRI v1 image API"
Apr 16 13:11:37.706684 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.706666 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:11:37.711412 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.711391 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a2bef8dd-2789-4927-87e1-1f73ddc6c1a7:/dev/nvme0n1p4 f1cd3729-0bef-4d7f-8683-6926dae0ed6e:/dev/nvme0n1p3]
Apr 16 13:11:37.711506 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.711411 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:11:37.715233 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.715213 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:11:37.717262 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.717156 2574 manager.go:217] Machine: {Timestamp:2026-04-16 13:11:37.715053276 +0000 UTC m=+0.404985082 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099963 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d324af7eee7b9a5fe36fdf0f048f5 SystemUUID:ec2d324a-f7ee-e7b9-a5fe-36fdf0f048f5 BootID:4b69d544-7730-473b-b42b-9eb2c5c7fa2f Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:06:04:d1:8e:67 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:06:04:d1:8e:67 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:0b:a8:a6:07:47 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:11:37.717262 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.717257 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:11:37.717363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.717337 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:11:37.718426 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.718400 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:11:37.718594 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.718428 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-242.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:11:37.718650 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.718603 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:11:37.718650 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.718612 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:11:37.718650 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.718625
2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:11:37.720095 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.720085 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:11:37.722006 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.721996 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:11:37.722106 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.722097 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:11:37.724601 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.724592 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:11:37.724634 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.724605 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:11:37.724634 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.724620 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:11:37.724634 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.724629 2574 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:11:37.724740 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.724638 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:11:37.725823 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.725811 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:11:37.725861 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.725829 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:11:37.728837 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.728823 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:11:37.730575 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:11:37.730562 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:11:37.731972 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.731961 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:11:37.732011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.731977 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:11:37.732011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.731983 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:11:37.732011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.731989 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:11:37.732011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.731996 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:11:37.732011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732005 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:11:37.732011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732013 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:11:37.732162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732019 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:11:37.732162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732026 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:11:37.732162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732032 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:11:37.732162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732052 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
13:11:37.732162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732079 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:11:37.732812 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732800 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:11:37.732848 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.732815 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:11:37.736696 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.736682 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:11:37.736756 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.736720 2574 server.go:1295] "Started kubelet" Apr 16 13:11:37.736853 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.736832 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:11:37.736853 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.736816 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:11:37.736942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.736880 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:11:37.736996 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.736978 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n5fn7" Apr 16 13:11:37.737440 ip-10-0-137-242 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 13:11:37.738235 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.738155 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:11:37.738682 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.738626 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-242.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:11:37.738682 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.738655 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-242.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:11:37.738774 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.738718 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:11:37.739600 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.739587 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:11:37.742999 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.742978 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n5fn7" Apr 16 13:11:37.747788 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.745554 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-137-242.ec2.internal.18a6d874751a2a5e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-242.ec2.internal,UID:ip-10-0-137-242.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-242.ec2.internal,},FirstTimestamp:2026-04-16 13:11:37.736694366 +0000 UTC m=+0.426626172,LastTimestamp:2026-04-16 13:11:37.736694366 +0000 UTC m=+0.426626172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-242.ec2.internal,}" Apr 16 13:11:37.749468 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.749430 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:11:37.749545 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.749488 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:11:37.749960 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.749945 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:11:37.750646 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.750627 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:11:37.750646 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.750630 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:11:37.750794 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.750656 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:11:37.750794 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.750766 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:11:37.750794 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.750778 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:11:37.750794 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.750775 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:37.750973 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.750815 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:11:37.750973 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.750824 2574 factory.go:55] Registering systemd factory Apr 16 13:11:37.750973 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.750833 2574 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:11:37.751102 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.751032 2574 factory.go:153] Registering CRI-O factory Apr 16 13:11:37.751102 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.751047 2574 factory.go:223] Registration of the crio container factory successfully Apr 16 13:11:37.751102 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.751066 2574 factory.go:103] Registering Raw factory Apr 16 13:11:37.751102 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.751075 2574 manager.go:1196] Started watching for new ooms in manager Apr 16 13:11:37.751671 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.751646 2574 manager.go:319] Starting recovery of all containers Apr 16 13:11:37.759548 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.759500 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:37.762011 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.761885 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-242.ec2.internal\" not found" node="ip-10-0-137-242.ec2.internal" Apr 16 13:11:37.762237 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.762223 2574 manager.go:324] Recovery completed Apr 16 13:11:37.766162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.766146 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:37.769135 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.769120 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:37.769216 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.769172 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:37.769216 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.769185 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:37.769641 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.769629 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:11:37.769696 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.769640 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:11:37.769696 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.769656 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:11:37.772119 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.772107 2574 policy_none.go:49] "None policy: Start" Apr 16 13:11:37.772169 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.772124 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:11:37.772169 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.772135 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:11:37.817988 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.817975 2574 manager.go:341] "Starting Device Plugin manager" Apr 16 13:11:37.818075 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.818006 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:11:37.818075 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.818016 2574 server.go:85] "Starting device plugin registration server" Apr 16 13:11:37.818258 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.818246 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:11:37.818305 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.818263 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:11:37.818469 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.818435 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:11:37.819374 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.818860 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin 
watcher) starts" Apr 16 13:11:37.819374 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.818876 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:11:37.819700 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.819543 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:11:37.819700 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.819611 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:37.876577 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.876530 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:11:37.877762 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.877744 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 13:11:37.877824 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.877778 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:11:37.877824 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.877801 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 13:11:37.877824 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.877813 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:11:37.877933 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.877854 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:11:37.881922 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.881871 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:37.919200 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.919177 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:37.920113 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.920099 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:37.920185 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.920129 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:37.920185 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.920140 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:37.920185 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.920163 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-242.ec2.internal" Apr 16 13:11:37.927442 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.927427 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-242.ec2.internal" Apr 16 13:11:37.927508 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.927449 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-242.ec2.internal\": node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 
13:11:37.939849 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:37.939829 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:37.978823 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.978794 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal"] Apr 16 13:11:37.978922 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.978866 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:37.979787 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.979772 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:37.979843 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.979802 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:37.979843 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.979811 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:37.981162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981150 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:37.981324 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981309 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:37.981369 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981338 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:37.981885 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981864 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:37.981931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981889 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:37.981931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981903 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:37.981931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981919 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:37.982026 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981935 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:37.982026 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.981948 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:37.983656 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.983640 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" Apr 16 13:11:37.983743 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.983669 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:11:37.984529 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.984513 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:11:37.984602 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.984544 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:11:37.984602 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:37.984557 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:11:38.009178 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.009156 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-242.ec2.internal\" not found" node="ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.012670 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.012652 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-242.ec2.internal\" not found" node="ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.040267 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.040224 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.052647 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.052622 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c92fa02f9ecbd5a41fab97b5e3b3d073-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal\" (UID: \"c92fa02f9ecbd5a41fab97b5e3b3d073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.052722 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.052683 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c92fa02f9ecbd5a41fab97b5e3b3d073-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal\" (UID: \"c92fa02f9ecbd5a41fab97b5e3b3d073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.052722 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.052712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5a8bce9435bf9de24fab90842b9d921a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-242.ec2.internal\" (UID: \"5a8bce9435bf9de24fab90842b9d921a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.140548 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.140434 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.153865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.153841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c92fa02f9ecbd5a41fab97b5e3b3d073-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal\" (UID: \"c92fa02f9ecbd5a41fab97b5e3b3d073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.153924 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.153869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c92fa02f9ecbd5a41fab97b5e3b3d073-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal\" (UID: \"c92fa02f9ecbd5a41fab97b5e3b3d073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.153924 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.153889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5a8bce9435bf9de24fab90842b9d921a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-242.ec2.internal\" (UID: \"5a8bce9435bf9de24fab90842b9d921a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.153998 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.153928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c92fa02f9ecbd5a41fab97b5e3b3d073-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal\" (UID: \"c92fa02f9ecbd5a41fab97b5e3b3d073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.153998 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.153934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c92fa02f9ecbd5a41fab97b5e3b3d073-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal\" (UID: \"c92fa02f9ecbd5a41fab97b5e3b3d073\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.153998 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.153964 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5a8bce9435bf9de24fab90842b9d921a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-242.ec2.internal\" (UID: \"5a8bce9435bf9de24fab90842b9d921a\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.241249 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.241210 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.312786 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.312762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.315696 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.315678 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" Apr 16 13:11:38.341871 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.341840 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.442409 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.442329 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.542934 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.542894 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.643400 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.643367 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.656765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.656748 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 13:11:38.656900 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.656885 2574 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:11:38.656961 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.656930 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:11:38.744183 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.744159 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.745253 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.745227 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:06:37 +0000 UTC" deadline="2027-10-29 05:24:40.616469407 +0000 UTC" Apr 16 13:11:38.745326 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.745255 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13456h13m1.871218858s" Apr 16 13:11:38.749983 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.749961 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 13:11:38.759925 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.759905 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:11:38.783285 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.783253 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-6w9sd" Apr 16 13:11:38.788357 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.788339 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6w9sd" Apr 16 13:11:38.804000 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:38.803967 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a8bce9435bf9de24fab90842b9d921a.slice/crio-f6b11d8616843b4081a3eda8ac418d772ed2187140f22367c6556ce370482483 WatchSource:0}: Error finding container f6b11d8616843b4081a3eda8ac418d772ed2187140f22367c6556ce370482483: Status 404 returned error can't find the container with id f6b11d8616843b4081a3eda8ac418d772ed2187140f22367c6556ce370482483 Apr 16 13:11:38.804358 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:38.804338 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc92fa02f9ecbd5a41fab97b5e3b3d073.slice/crio-6bb90ca507ce1e31c0b55bdfd81acc5f4cfc1437878f26d83430f0c14b6658c4 WatchSource:0}: Error finding container 6bb90ca507ce1e31c0b55bdfd81acc5f4cfc1437878f26d83430f0c14b6658c4: Status 404 returned error can't find the container with id 6bb90ca507ce1e31c0b55bdfd81acc5f4cfc1437878f26d83430f0c14b6658c4 Apr 16 13:11:38.809534 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.809518 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:11:38.844899 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.844868 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.881380 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.881335 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" 
event={"ID":"c92fa02f9ecbd5a41fab97b5e3b3d073","Type":"ContainerStarted","Data":"6bb90ca507ce1e31c0b55bdfd81acc5f4cfc1437878f26d83430f0c14b6658c4"} Apr 16 13:11:38.882323 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.882300 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" event={"ID":"5a8bce9435bf9de24fab90842b9d921a","Type":"ContainerStarted","Data":"f6b11d8616843b4081a3eda8ac418d772ed2187140f22367c6556ce370482483"} Apr 16 13:11:38.945569 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:38.945543 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-242.ec2.internal\" not found" Apr 16 13:11:38.994873 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:38.994796 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:39.051003 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.050977 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" Apr 16 13:11:39.059459 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.059440 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:11:39.061134 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.061121 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" Apr 16 13:11:39.070446 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.070427 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:11:39.170228 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.170199 2574 reflector.go:430] "Caches populated" 
type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:39.495142 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.495110 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:39.644570 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.644537 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:39.725432 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.725389 2574 apiserver.go:52] "Watching apiserver" Apr 16 13:11:39.732310 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.732251 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:11:39.732676 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.732653 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-5nl2m","kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal","openshift-cluster-node-tuning-operator/tuned-mqpkt","openshift-dns/node-resolver-rcb6c","openshift-multus/multus-additional-cni-plugins-lcmf6","openshift-network-diagnostics/network-check-target-vcxvp","openshift-ovn-kubernetes/ovnkube-node-xp25r","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px","openshift-image-registry/node-ca-j7cl9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal","openshift-multus/multus-h97bc","openshift-multus/network-metrics-daemon-shf7m","openshift-network-operator/iptables-alerter-hqlgp"] Apr 16 13:11:39.735208 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.735182 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j7cl9" Apr 16 13:11:39.735320 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.735275 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hqlgp" Apr 16 13:11:39.738045 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.738018 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:11:39.738319 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.738298 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:39.738421 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:39.738392 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:39.738505 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.738402 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xq9cv\"" Apr 16 13:11:39.739102 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.739078 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:11:39.739102 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.739093 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-72fwf\"" Apr 16 13:11:39.739230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.739117 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:11:39.739230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.739140 2574 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:11:39.739230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.739202 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:11:39.739230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.739222 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:11:39.739695 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.739674 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.740912 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.740863 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.741011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.740965 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:39.741069 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:39.741016 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:39.741920 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.741904 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:11:39.742190 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.742173 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:11:39.742384 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.742369 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nfhhc\"" Apr 16 13:11:39.742824 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.742808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:39.743298 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.743282 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:11:39.743381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.743322 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:11:39.743844 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.743609 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:11:39.743844 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.743648 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:11:39.743844 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.743699 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"default-dockercfg-d598s\"" Apr 16 13:11:39.744144 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.744129 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.745555 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.745508 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:11:39.745555 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.745526 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:11:39.745693 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.745556 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9d5jt\"" Apr 16 13:11:39.746450 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.746431 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:11:39.746817 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.746790 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:11:39.746817 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.746814 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:11:39.746954 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.746864 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p8gp2\"" Apr 16 13:11:39.747254 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.747237 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.747336 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.747272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.748950 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.748929 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-rcb6c" Apr 16 13:11:39.749877 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.749861 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mmsmj\"" Apr 16 13:11:39.749974 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.749886 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:11:39.749974 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.749923 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:11:39.749974 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.749941 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f5965\"" Apr 16 13:11:39.749974 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.749950 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:11:39.750239 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.750223 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:11:39.750295 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.750279 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:11:39.750372 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.750356 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:11:39.750426 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.750402 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:11:39.750644 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.750630 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:11:39.751154 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.751137 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pn8p4\"" Apr 16 13:11:39.751375 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.751357 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:11:39.751541 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.751394 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:11:39.751687 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.751659 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:11:39.762535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-kubelet\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.762638 ip-10-0-137-242 kubenswrapper[2574]: 
I0416 13:11:39.762550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f731cddc-60f5-4b83-954c-670a96ec5019-iptables-alerter-script\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp" Apr 16 13:11:39.762638 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-kubelet\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.762638 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762622 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-device-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.762800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-cni-bin\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.762800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762674 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-conf-dir\") pod \"multus-h97bc\" (UID: 
\"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.762800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-systemd\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.762800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762731 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-etc-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.762800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762752 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-ovn\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.762800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rfld\" (UniqueName: \"kubernetes.io/projected/89d8b508-8fec-42a0-b382-7b625093ae71-kube-api-access-8rfld\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9" Apr 16 13:11:39.762800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: 
\"kubernetes.io/secret/a278bcef-bcae-4a88-80bc-8c4850ab20b1-agent-certs\") pod \"konnectivity-agent-5nl2m\" (UID: \"a278bcef-bcae-4a88-80bc-8c4850ab20b1\") " pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762814 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-run-netns\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f6415fd-c655-4953-99ef-9575b096b3e2-tmp-dir\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762849 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89d8b508-8fec-42a0-b382-7b625093ae71-host\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-hostroot\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-system-cni-dir\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cni-binary-copy\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-systemd\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762949 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762965 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-os-release\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " 
pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.762980 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-tmp\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763001 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-etc-selinux\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763040 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-os-release\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-run\") 
pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-socket-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px"
Apr 16 13:11:39.763176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6hj6\" (UniqueName: \"kubernetes.io/projected/2f6415fd-c655-4953-99ef-9575b096b3e2-kube-api-access-x6hj6\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763229 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-log-socket\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-cni-dir\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763304 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-kubernetes\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysctl-conf\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-slash\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-cni-netd\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cnibin\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysconfig\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-systemd-units\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-system-cni-dir\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-k8s-cni-cncf-io\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763593 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-daemon-config\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.763880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-tuned\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763630 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-cni-bin\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763643 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/abe3d854-132c-4011-9672-f19aca1e1f71-ovn-node-metrics-cert\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763658 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763725 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-netns\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763755 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f6415fd-c655-4953-99ef-9575b096b3e2-hosts-file\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763797 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk69z\" (UniqueName: \"kubernetes.io/projected/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-kube-api-access-tk69z\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763825 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f795d25-9d1f-4d82-807d-5a1c84b53f53-cni-binary-copy\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7tjc\" (UniqueName: \"kubernetes.io/projected/de1ece58-c2c4-43a2-9674-0ba69c468ab3-kube-api-access-p7tjc\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a278bcef-bcae-4a88-80bc-8c4850ab20b1-konnectivity-ca\") pod \"konnectivity-agent-5nl2m\" (UID: \"a278bcef-bcae-4a88-80bc-8c4850ab20b1\") " pod="kube-system/konnectivity-agent-5nl2m"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-registration-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptldn\" (UniqueName: \"kubernetes.io/projected/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-kube-api-access-ptldn\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zzxj\" (UniqueName: \"kubernetes.io/projected/abe3d854-132c-4011-9672-f19aca1e1f71-kube-api-access-7zzxj\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.763979 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-sys\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.764766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-env-overrides\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-ovnkube-script-lib\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/89d8b508-8fec-42a0-b382-7b625093ae71-serviceca\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f731cddc-60f5-4b83-954c-670a96ec5019-host-slash\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764122 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-multus-certs\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764145 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-lib-modules\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-host\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-var-lib-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764240 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-node-log\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764275 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-cnibin\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-socket-dir-parent\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-cni-multus\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kztck\" (UniqueName: \"kubernetes.io/projected/0f795d25-9d1f-4d82-807d-5a1c84b53f53-kube-api-access-kztck\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764425 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-run-ovn-kubernetes\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2wzq\" (UniqueName: \"kubernetes.io/projected/f731cddc-60f5-4b83-954c-670a96ec5019-kube-api-access-j2wzq\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764498 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-etc-kubernetes\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764543 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-var-lib-kubelet\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.765441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-ovnkube-config\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.766175 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764606 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-modprobe-d\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.766175 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysctl-d\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.766175 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764659 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgjbn\" (UniqueName: \"kubernetes.io/projected/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-kube-api-access-wgjbn\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.766175 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.764682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-sys-fs\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px"
Apr 16 13:11:39.788965 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.788933 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:06:38 +0000 UTC" deadline="2028-01-10 09:52:25.26284425 +0000 UTC"
Apr 16 13:11:39.789057 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.788966 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15212h40m45.473881937s"
Apr 16 13:11:39.865950 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.865917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-run-ovn-kubernetes\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.865950 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.865960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2wzq\" (UniqueName: \"kubernetes.io/projected/f731cddc-60f5-4b83-954c-670a96ec5019-kube-api-access-j2wzq\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp"
Apr 16 13:11:39.866194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.865980 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-etc-kubernetes\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.866194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866021 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-etc-kubernetes\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.866194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866033 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-run-ovn-kubernetes\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-var-lib-kubelet\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.866194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-ovnkube-config\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-modprobe-d\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.866194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysctl-d\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.866194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgjbn\" (UniqueName: \"kubernetes.io/projected/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-kube-api-access-wgjbn\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-var-lib-kubelet\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-modprobe-d\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysctl-d\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-sys-fs\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-kubelet\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866450 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f731cddc-60f5-4b83-954c-670a96ec5019-iptables-alerter-script\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-kubelet\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-kubelet\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-device-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-sys-fs\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-cni-bin\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-kubelet\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.866595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-conf-dir\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-systemd\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866608 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-device-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866631 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-etc-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-conf-dir\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-ovn\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866665 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-systemd\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rfld\" (UniqueName: \"kubernetes.io/projected/89d8b508-8fec-42a0-b382-7b625093ae71-kube-api-access-8rfld\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866695 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-ovn\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-ovnkube-config\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-etc-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a278bcef-bcae-4a88-80bc-8c4850ab20b1-agent-certs\") pod \"konnectivity-agent-5nl2m\" (UID: \"a278bcef-bcae-4a88-80bc-8c4850ab20b1\") " pod="kube-system/konnectivity-agent-5nl2m"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-cni-bin\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-run-netns\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f6415fd-c655-4953-99ef-9575b096b3e2-tmp-dir\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89d8b508-8fec-42a0-b382-7b625093ae71-host\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9"
Apr 16 13:11:39.866969 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-hostroot\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc"
Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866987 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-system-cni-dir\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6"
Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.866993 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867012 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/89d8b508-8fec-42a0-b382-7b625093ae71-host\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9"
Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cni-binary-copy\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6"
Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867036 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-run-netns\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName:
\"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-systemd\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867056 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-system-cni-dir\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f731cddc-60f5-4b83-954c-670a96ec5019-iptables-alerter-script\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-hostroot\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867097 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-os-release\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867111 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-tmp\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867120 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-systemd\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867131 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-etc-selinux\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867156 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867178 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-os-release\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.867669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867203 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-run\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-socket-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2f6415fd-c655-4953-99ef-9575b096b3e2-tmp-dir\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-etc-selinux\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6hj6\" (UniqueName: \"kubernetes.io/projected/2f6415fd-c655-4953-99ef-9575b096b3e2-kube-api-access-x6hj6\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-run\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867305 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-os-release\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-os-release\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-log-socket\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867370 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-socket-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:39.867392 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867409 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-cni-dir\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-run-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-kubernetes\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysctl-conf\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.868498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-kubernetes\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:39.867521 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:40.367488799 +0000 UTC m=+3.057420609 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysctl-conf\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867626 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-log-socket\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-slash\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867662 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-cni-dir\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-cni-netd\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867708 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-cni-netd\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-slash\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867781 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cnibin\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867834 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysconfig\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-systemd-units\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867885 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-system-cni-dir\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-k8s-cni-cncf-io\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867933 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cnibin\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.869249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-daemon-config\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.867975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-tuned\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-cni-bin\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/abe3d854-132c-4011-9672-f19aca1e1f71-ovn-node-metrics-cert\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-netns\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 
13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f6415fd-c655-4953-99ef-9575b096b3e2-hosts-file\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk69z\" (UniqueName: \"kubernetes.io/projected/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-kube-api-access-tk69z\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f795d25-9d1f-4d82-807d-5a1c84b53f53-cni-binary-copy\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cni-binary-copy\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868281 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7tjc\" (UniqueName: \"kubernetes.io/projected/de1ece58-c2c4-43a2-9674-0ba69c468ab3-kube-api-access-p7tjc\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " 
pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-systemd-units\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868148 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-host-cni-bin\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868348 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a278bcef-bcae-4a88-80bc-8c4850ab20b1-konnectivity-ca\") pod \"konnectivity-agent-5nl2m\" (UID: \"a278bcef-bcae-4a88-80bc-8c4850ab20b1\") " pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868376 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-registration-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.870055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptldn\" (UniqueName: \"kubernetes.io/projected/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-kube-api-access-ptldn\") pod 
\"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zzxj\" (UniqueName: \"kubernetes.io/projected/abe3d854-132c-4011-9672-f19aca1e1f71-kube-api-access-7zzxj\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868449 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-sys\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868495 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/de1ece58-c2c4-43a2-9674-0ba69c468ab3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-env-overrides\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868534 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-ovnkube-script-lib\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868558 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/89d8b508-8fec-42a0-b382-7b625093ae71-serviceca\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2f6415fd-c655-4953-99ef-9575b096b3e2-hosts-file\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868587 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f731cddc-60f5-4b83-954c-670a96ec5019-host-slash\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f731cddc-60f5-4b83-954c-670a96ec5019-host-slash\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868676 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-sysconfig\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-netns\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-multus-certs\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-lib-modules\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-host\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-var-lib-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-node-log\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.870765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-cnibin\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868971 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-socket-dir-parent\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.868995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-cni-multus\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kztck\" (UniqueName: \"kubernetes.io/projected/0f795d25-9d1f-4d82-807d-5a1c84b53f53-kube-api-access-kztck\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869071 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a278bcef-bcae-4a88-80bc-8c4850ab20b1-konnectivity-ca\") pod \"konnectivity-agent-5nl2m\" (UID: \"a278bcef-bcae-4a88-80bc-8c4850ab20b1\") " pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-system-cni-dir\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-registration-dir\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869253 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-multus-certs\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/de1ece58-c2c4-43a2-9674-0ba69c468ab3-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-sys\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869344 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-lib-modules\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-node-log\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-cnibin\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-host\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-socket-dir-parent\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/abe3d854-132c-4011-9672-f19aca1e1f71-var-lib-openvswitch\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-var-lib-cni-multus\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/0f795d25-9d1f-4d82-807d-5a1c84b53f53-cni-binary-copy\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.871678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869650 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0f795d25-9d1f-4d82-807d-5a1c84b53f53-host-run-k8s-cni-cncf-io\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.872530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-env-overrides\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.872530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.869922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/89d8b508-8fec-42a0-b382-7b625093ae71-serviceca\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9" Apr 16 13:11:39.872530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.870013 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/abe3d854-132c-4011-9672-f19aca1e1f71-ovnkube-script-lib\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.872530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.870081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/0f795d25-9d1f-4d82-807d-5a1c84b53f53-multus-daemon-config\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.872530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.870581 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-tmp\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.872530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.870859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-etc-tuned\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.872530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.871676 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/abe3d854-132c-4011-9672-f19aca1e1f71-ovn-node-metrics-cert\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.872530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.871715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a278bcef-bcae-4a88-80bc-8c4850ab20b1-agent-certs\") pod \"konnectivity-agent-5nl2m\" (UID: \"a278bcef-bcae-4a88-80bc-8c4850ab20b1\") " pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:39.874481 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.874301 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgjbn\" (UniqueName: 
\"kubernetes.io/projected/d9c9aa14-1a19-474d-89b7-c6b76aed5c4a-kube-api-access-wgjbn\") pod \"tuned-mqpkt\" (UID: \"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a\") " pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:39.874481 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.874413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2wzq\" (UniqueName: \"kubernetes.io/projected/f731cddc-60f5-4b83-954c-670a96ec5019-kube-api-access-j2wzq\") pod \"iptables-alerter-hqlgp\" (UID: \"f731cddc-60f5-4b83-954c-670a96ec5019\") " pod="openshift-network-operator/iptables-alerter-hqlgp" Apr 16 13:11:39.879200 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.878859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rfld\" (UniqueName: \"kubernetes.io/projected/89d8b508-8fec-42a0-b382-7b625093ae71-kube-api-access-8rfld\") pod \"node-ca-j7cl9\" (UID: \"89d8b508-8fec-42a0-b382-7b625093ae71\") " pod="openshift-image-registry/node-ca-j7cl9" Apr 16 13:11:39.879200 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.878866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6hj6\" (UniqueName: \"kubernetes.io/projected/2f6415fd-c655-4953-99ef-9575b096b3e2-kube-api-access-x6hj6\") pod \"node-resolver-rcb6c\" (UID: \"2f6415fd-c655-4953-99ef-9575b096b3e2\") " pod="openshift-dns/node-resolver-rcb6c" Apr 16 13:11:39.879200 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:39.878924 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:39.879200 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:39.878971 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:39.879200 ip-10-0-137-242 
kubenswrapper[2574]: E0416 13:11:39.878984 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5p98r for pod openshift-network-diagnostics/network-check-target-vcxvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:39.879200 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:39.879051 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r podName:249e62bb-dc30-4b99-98be-87bddf8915ee nodeName:}" failed. No retries permitted until 2026-04-16 13:11:40.379032682 +0000 UTC m=+3.068964478 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5p98r" (UniqueName: "kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r") pod "network-check-target-vcxvp" (UID: "249e62bb-dc30-4b99-98be-87bddf8915ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:39.883565 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.881207 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk69z\" (UniqueName: \"kubernetes.io/projected/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-kube-api-access-tk69z\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:39.883565 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.881878 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7tjc\" (UniqueName: \"kubernetes.io/projected/de1ece58-c2c4-43a2-9674-0ba69c468ab3-kube-api-access-p7tjc\") pod \"multus-additional-cni-plugins-lcmf6\" (UID: \"de1ece58-c2c4-43a2-9674-0ba69c468ab3\") " 
pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:39.883565 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.882181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kztck\" (UniqueName: \"kubernetes.io/projected/0f795d25-9d1f-4d82-807d-5a1c84b53f53-kube-api-access-kztck\") pod \"multus-h97bc\" (UID: \"0f795d25-9d1f-4d82-807d-5a1c84b53f53\") " pod="openshift-multus/multus-h97bc" Apr 16 13:11:39.885128 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.885106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zzxj\" (UniqueName: \"kubernetes.io/projected/abe3d854-132c-4011-9672-f19aca1e1f71-kube-api-access-7zzxj\") pod \"ovnkube-node-xp25r\" (UID: \"abe3d854-132c-4011-9672-f19aca1e1f71\") " pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:39.885219 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:39.885185 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptldn\" (UniqueName: \"kubernetes.io/projected/07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865-kube-api-access-ptldn\") pod \"aws-ebs-csi-driver-node-g59px\" (UID: \"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:40.048719 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.048639 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-j7cl9" Apr 16 13:11:40.057567 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.057541 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hqlgp" Apr 16 13:11:40.067255 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.067221 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" Apr 16 13:11:40.073178 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.073151 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h97bc" Apr 16 13:11:40.080822 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.080799 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:40.088552 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.088522 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" Apr 16 13:11:40.095226 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.095200 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" Apr 16 13:11:40.101959 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.101933 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:11:40.107629 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.107603 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-rcb6c" Apr 16 13:11:40.366216 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:40.366189 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89d8b508_8fec_42a0_b382_7b625093ae71.slice/crio-3bb92253db9477d4b37046ebdcc0414677308083a941541eb74720376bcd2f3a WatchSource:0}: Error finding container 3bb92253db9477d4b37046ebdcc0414677308083a941541eb74720376bcd2f3a: Status 404 returned error can't find the container with id 3bb92253db9477d4b37046ebdcc0414677308083a941541eb74720376bcd2f3a Apr 16 13:11:40.367349 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:40.367321 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9c9aa14_1a19_474d_89b7_c6b76aed5c4a.slice/crio-1e30248f613256e838728ded5eb47060ea1db559e1c81b681e5f3cd7507c36e8 WatchSource:0}: Error finding container 1e30248f613256e838728ded5eb47060ea1db559e1c81b681e5f3cd7507c36e8: Status 404 returned error can't find the container with id 1e30248f613256e838728ded5eb47060ea1db559e1c81b681e5f3cd7507c36e8 Apr 16 13:11:40.368158 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:40.368133 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde1ece58_c2c4_43a2_9674_0ba69c468ab3.slice/crio-8f80304ac32ba536fcaecd30afa656140f94f906a7eede9ffa70ec6a36b9b635 WatchSource:0}: Error finding container 8f80304ac32ba536fcaecd30afa656140f94f906a7eede9ffa70ec6a36b9b635: Status 404 returned error can't find the container with id 8f80304ac32ba536fcaecd30afa656140f94f906a7eede9ffa70ec6a36b9b635 Apr 16 13:11:40.368988 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:40.368960 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf731cddc_60f5_4b83_954c_670a96ec5019.slice/crio-fa2ee99bb57d9a3cd828c9a685c5c0f56f4ac1e88d269a2c304da4a206a6ecce WatchSource:0}: Error finding container fa2ee99bb57d9a3cd828c9a685c5c0f56f4ac1e88d269a2c304da4a206a6ecce: Status 404 returned error can't find the container with id fa2ee99bb57d9a3cd828c9a685c5c0f56f4ac1e88d269a2c304da4a206a6ecce Apr 16 13:11:40.370687 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:40.370664 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f795d25_9d1f_4d82_807d_5a1c84b53f53.slice/crio-eaed48f5c52562ad106f16da4e5c68e353375c0b73f579d0f39947b005981bf1 WatchSource:0}: Error finding container eaed48f5c52562ad106f16da4e5c68e353375c0b73f579d0f39947b005981bf1: Status 404 returned error can't find the container with id eaed48f5c52562ad106f16da4e5c68e353375c0b73f579d0f39947b005981bf1 Apr 16 13:11:40.371865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.371847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:40.372008 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:40.371956 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:40.372124 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:40.372019 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:41.371995 +0000 UTC m=+4.061926796 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:40.373740 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:40.373716 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f6415fd_c655_4953_99ef_9575b096b3e2.slice/crio-9c5d0863dfa7cee524254893229cbf8acb5a7c8d10c9305e2b258e273a39a3fd WatchSource:0}: Error finding container 9c5d0863dfa7cee524254893229cbf8acb5a7c8d10c9305e2b258e273a39a3fd: Status 404 returned error can't find the container with id 9c5d0863dfa7cee524254893229cbf8acb5a7c8d10c9305e2b258e273a39a3fd Apr 16 13:11:40.374897 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:11:40.374874 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabe3d854_132c_4011_9672_f19aca1e1f71.slice/crio-430a5b3129680ad1a014897e8b962b77416ee1c1c06d7ec6ea087de58ca67cbe WatchSource:0}: Error finding container 430a5b3129680ad1a014897e8b962b77416ee1c1c06d7ec6ea087de58ca67cbe: Status 404 returned error can't find the container with id 430a5b3129680ad1a014897e8b962b77416ee1c1c06d7ec6ea087de58ca67cbe Apr 16 13:11:40.473007 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.472976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:40.473153 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:40.473132 2574 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:40.473211 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:40.473159 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:40.473211 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:40.473180 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5p98r for pod openshift-network-diagnostics/network-check-target-vcxvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:40.473318 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:40.473242 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r podName:249e62bb-dc30-4b99-98be-87bddf8915ee nodeName:}" failed. No retries permitted until 2026-04-16 13:11:41.473216349 +0000 UTC m=+4.163148162 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5p98r" (UniqueName: "kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r") pod "network-check-target-vcxvp" (UID: "249e62bb-dc30-4b99-98be-87bddf8915ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:40.792919 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.792799 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:06:38 +0000 UTC" deadline="2027-09-23 13:21:18.992533479 +0000 UTC" Apr 16 13:11:40.792919 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.792842 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12600h9m38.199696518s" Apr 16 13:11:40.878200 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.878171 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:40.878377 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:40.878276 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:40.895872 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.895801 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" event={"ID":"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865","Type":"ContainerStarted","Data":"e8837411f251978611dcb00699a0e0d4efea39ead47de6314a6c9de0485b4fdc"} Apr 16 13:11:40.898669 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.898606 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h97bc" event={"ID":"0f795d25-9d1f-4d82-807d-5a1c84b53f53","Type":"ContainerStarted","Data":"eaed48f5c52562ad106f16da4e5c68e353375c0b73f579d0f39947b005981bf1"} Apr 16 13:11:40.908351 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.908322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" event={"ID":"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a","Type":"ContainerStarted","Data":"1e30248f613256e838728ded5eb47060ea1db559e1c81b681e5f3cd7507c36e8"} Apr 16 13:11:40.916675 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.916647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hqlgp" event={"ID":"f731cddc-60f5-4b83-954c-670a96ec5019","Type":"ContainerStarted","Data":"fa2ee99bb57d9a3cd828c9a685c5c0f56f4ac1e88d269a2c304da4a206a6ecce"} Apr 16 13:11:40.918690 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.918665 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j7cl9" event={"ID":"89d8b508-8fec-42a0-b382-7b625093ae71","Type":"ContainerStarted","Data":"3bb92253db9477d4b37046ebdcc0414677308083a941541eb74720376bcd2f3a"} Apr 16 13:11:40.930797 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.930769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" 
event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"430a5b3129680ad1a014897e8b962b77416ee1c1c06d7ec6ea087de58ca67cbe"} Apr 16 13:11:40.933253 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.933226 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rcb6c" event={"ID":"2f6415fd-c655-4953-99ef-9575b096b3e2","Type":"ContainerStarted","Data":"9c5d0863dfa7cee524254893229cbf8acb5a7c8d10c9305e2b258e273a39a3fd"} Apr 16 13:11:40.937426 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.937403 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerStarted","Data":"8f80304ac32ba536fcaecd30afa656140f94f906a7eede9ffa70ec6a36b9b635"} Apr 16 13:11:40.943555 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.943527 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" event={"ID":"5a8bce9435bf9de24fab90842b9d921a","Type":"ContainerStarted","Data":"94a052f45858f8bb061f2154e5a31005c4536ae9a6145d1e66201b7e60a680b4"} Apr 16 13:11:40.946334 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:40.946310 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5nl2m" event={"ID":"a278bcef-bcae-4a88-80bc-8c4850ab20b1","Type":"ContainerStarted","Data":"b60d006e06fd8e1c46c0ec939f8cd8f4509449262eb1ab15023df2fb135ff4d2"} Apr 16 13:11:41.380105 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:41.379515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:41.380105 ip-10-0-137-242 kubenswrapper[2574]: E0416 
13:11:41.379684 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:41.380105 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:41.379748 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:43.379730467 +0000 UTC m=+6.069662265 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:41.481047 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:41.480438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:41.481047 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:41.480604 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:41.481047 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:41.480625 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:41.481047 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:41.480647 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5p98r for pod 
openshift-network-diagnostics/network-check-target-vcxvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:41.481047 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:41.480707 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r podName:249e62bb-dc30-4b99-98be-87bddf8915ee nodeName:}" failed. No retries permitted until 2026-04-16 13:11:43.480685356 +0000 UTC m=+6.170617163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5p98r" (UniqueName: "kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r") pod "network-check-target-vcxvp" (UID: "249e62bb-dc30-4b99-98be-87bddf8915ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:41.881909 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:41.881389 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:41.881909 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:41.881534 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:41.959133 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:41.958025 2574 generic.go:358] "Generic (PLEG): container finished" podID="c92fa02f9ecbd5a41fab97b5e3b3d073" containerID="d0d3ce263c3a7029488845610e1958bf5a3cf7a3b21cb53f9a5b46cac63b4725" exitCode=0 Apr 16 13:11:41.959133 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:41.959085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" event={"ID":"c92fa02f9ecbd5a41fab97b5e3b3d073","Type":"ContainerDied","Data":"d0d3ce263c3a7029488845610e1958bf5a3cf7a3b21cb53f9a5b46cac63b4725"} Apr 16 13:11:41.972916 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:41.972866 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-242.ec2.internal" podStartSLOduration=2.9728496140000003 podStartE2EDuration="2.972849614s" podCreationTimestamp="2026-04-16 13:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:11:40.957567666 +0000 UTC m=+3.647499483" watchObservedRunningTime="2026-04-16 13:11:41.972849614 +0000 UTC m=+4.662781431" Apr 16 13:11:42.878550 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:42.878514 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:42.878723 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:42.878660 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:42.973408 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:42.973371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" event={"ID":"c92fa02f9ecbd5a41fab97b5e3b3d073","Type":"ContainerStarted","Data":"c5db6e28d0f43a8dcb27152dca8957e6e13cf35db2dfa1d6ef1ebd31cc058a7f"} Apr 16 13:11:42.988195 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:42.987127 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-242.ec2.internal" podStartSLOduration=3.9871085429999997 podStartE2EDuration="3.987108543s" podCreationTimestamp="2026-04-16 13:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:11:42.98699411 +0000 UTC m=+5.676925928" watchObservedRunningTime="2026-04-16 13:11:42.987108543 +0000 UTC m=+5.677040361" Apr 16 13:11:43.397044 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:43.396995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:43.397279 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:43.397190 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:43.397279 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:43.397257 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs 
podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:47.397238077 +0000 UTC m=+10.087169882 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:43.498237 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:43.497614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:43.498237 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:43.497785 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:43.498237 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:43.497808 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:43.498237 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:43.497821 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5p98r for pod openshift-network-diagnostics/network-check-target-vcxvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:43.498237 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:43.497880 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r podName:249e62bb-dc30-4b99-98be-87bddf8915ee nodeName:}" failed. No retries permitted until 2026-04-16 13:11:47.497860584 +0000 UTC m=+10.187792392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5p98r" (UniqueName: "kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r") pod "network-check-target-vcxvp" (UID: "249e62bb-dc30-4b99-98be-87bddf8915ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:43.879061 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:43.878971 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:43.879215 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:43.879091 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:44.878733 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:44.878698 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:44.879801 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:44.878875 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:45.879050 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:45.878791 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:45.879050 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:45.878909 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:46.878370 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:46.878336 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:46.878844 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:46.878508 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:47.427716 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:47.427165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:47.427716 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:47.427311 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:47.427716 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:47.427375 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:55.42735672 +0000 UTC m=+18.117288515 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:47.528239 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:47.528194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:47.528420 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:47.528384 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:47.528420 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:47.528411 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:47.528569 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:47.528425 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5p98r for pod openshift-network-diagnostics/network-check-target-vcxvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:47.528569 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:47.528512 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r podName:249e62bb-dc30-4b99-98be-87bddf8915ee nodeName:}" failed. 
No retries permitted until 2026-04-16 13:11:55.528491838 +0000 UTC m=+18.218423641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5p98r" (UniqueName: "kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r") pod "network-check-target-vcxvp" (UID: "249e62bb-dc30-4b99-98be-87bddf8915ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:47.880300 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:47.879743 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:47.880300 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:47.879882 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:48.878471 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:48.878422 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:48.878668 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:48.878586 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:49.878308 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:49.878275 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:49.878753 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:49.878391 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:50.878193 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:50.878157 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:50.878384 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:50.878279 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:51.878569 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:51.878539 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:51.879008 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:51.878659 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:52.878248 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:52.878212 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:52.878436 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:52.878326 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:53.878253 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:53.878210 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:53.878713 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:53.878339 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:54.878882 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:54.878796 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:54.879239 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:54.878919 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:55.486495 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:55.486441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:55.486683 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:55.486593 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:55.486683 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:55.486677 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:11.486660144 +0000 UTC m=+34.176591937 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:55.587630 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:55.587595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:55.587808 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:55.587786 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:55.587862 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:55.587815 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:55.587862 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:55.587830 2574 projected.go:194] Error preparing data for projected volume kube-api-access-5p98r for pod openshift-network-diagnostics/network-check-target-vcxvp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:55.587929 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:55.587881 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r podName:249e62bb-dc30-4b99-98be-87bddf8915ee nodeName:}" failed. 
No retries permitted until 2026-04-16 13:12:11.587866934 +0000 UTC m=+34.277798727 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5p98r" (UniqueName: "kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r") pod "network-check-target-vcxvp" (UID: "249e62bb-dc30-4b99-98be-87bddf8915ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:55.878154 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:55.878074 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:55.878318 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:55.878185 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:56.878251 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:56.878216 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:56.878753 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:56.878353 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:57.879300 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:57.879110 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:57.879944 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:57.879365 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:11:58.003169 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.002951 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-5nl2m" event={"ID":"a278bcef-bcae-4a88-80bc-8c4850ab20b1","Type":"ContainerStarted","Data":"bc1eef742638c94a4cf1c8b9f1d1b1d2967e6ad22fc934de62ed48f67eb06f87"} Apr 16 13:11:58.005171 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.005137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" event={"ID":"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865","Type":"ContainerStarted","Data":"ceb9de6826afb32d7706d02142e2551dc55c32865bb1684357ae383522b2076f"} Apr 16 13:11:58.006277 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.006256 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h97bc" event={"ID":"0f795d25-9d1f-4d82-807d-5a1c84b53f53","Type":"ContainerStarted","Data":"310d405451fac9b297299a2802001e02b650c9be1ae22d6261052475e5c78829"} Apr 16 13:11:58.008071 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.008051 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" event={"ID":"d9c9aa14-1a19-474d-89b7-c6b76aed5c4a","Type":"ContainerStarted","Data":"ac2df3986322bb22d2300f79130ca3183f2e0a1e3c9932189850b88ff6fe6f07"} Apr 16 13:11:58.009295 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.009277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-j7cl9" event={"ID":"89d8b508-8fec-42a0-b382-7b625093ae71","Type":"ContainerStarted","Data":"b404477d6dced2b5fca49842813796064db18f2304e6a0247b1cbdf79f149a51"} Apr 16 13:11:58.010819 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.010802 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:11:58.011113 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.011096 2574 generic.go:358] "Generic (PLEG): container finished" podID="abe3d854-132c-4011-9672-f19aca1e1f71" containerID="78e0cd8d3cb424171a714a487f942af569c1822cd54105ce10371ba521af81bc" exitCode=1 Apr 16 13:11:58.011196 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.011152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerDied","Data":"78e0cd8d3cb424171a714a487f942af569c1822cd54105ce10371ba521af81bc"} Apr 16 13:11:58.011196 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.011167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"84f7c0aa095111c773b4e526fa7f8dbd266b7a38a96fb1fc820c3b26c8904bb4"} Apr 16 13:11:58.012329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.012273 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-rcb6c" 
event={"ID":"2f6415fd-c655-4953-99ef-9575b096b3e2","Type":"ContainerStarted","Data":"a0834f7b54d507021fdf3668c87b89484d34fbe2d53cdb6c4df9f2fa28f20cdd"} Apr 16 13:11:58.013472 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.013441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerStarted","Data":"8a020a39d5fa77b0adaf79f7757ef8b7516e7f935c0c72bbd1ed9cbe692558d5"} Apr 16 13:11:58.017656 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.017616 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-5nl2m" podStartSLOduration=12.041076519 podStartE2EDuration="21.017606721s" podCreationTimestamp="2026-04-16 13:11:37 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.379155411 +0000 UTC m=+3.069087209" lastFinishedPulling="2026-04-16 13:11:49.355685603 +0000 UTC m=+12.045617411" observedRunningTime="2026-04-16 13:11:58.017095684 +0000 UTC m=+20.707027498" watchObservedRunningTime="2026-04-16 13:11:58.017606721 +0000 UTC m=+20.707538536" Apr 16 13:11:58.032878 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.032829 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h97bc" podStartSLOduration=3.942436366 podStartE2EDuration="21.032813454s" podCreationTimestamp="2026-04-16 13:11:37 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.372680552 +0000 UTC m=+3.062612358" lastFinishedPulling="2026-04-16 13:11:57.463057653 +0000 UTC m=+20.152989446" observedRunningTime="2026-04-16 13:11:58.032711162 +0000 UTC m=+20.722642979" watchObservedRunningTime="2026-04-16 13:11:58.032813454 +0000 UTC m=+20.722745270" Apr 16 13:11:58.049073 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.049035 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-j7cl9" podStartSLOduration=3.983199667 
podStartE2EDuration="21.049021185s" podCreationTimestamp="2026-04-16 13:11:37 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.367866005 +0000 UTC m=+3.057797812" lastFinishedPulling="2026-04-16 13:11:57.433687534 +0000 UTC m=+20.123619330" observedRunningTime="2026-04-16 13:11:58.048653302 +0000 UTC m=+20.738585118" watchObservedRunningTime="2026-04-16 13:11:58.049021185 +0000 UTC m=+20.738953000" Apr 16 13:11:58.065904 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.065861 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mqpkt" podStartSLOduration=3.971491007 podStartE2EDuration="21.06584778s" podCreationTimestamp="2026-04-16 13:11:37 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.369571182 +0000 UTC m=+3.059502975" lastFinishedPulling="2026-04-16 13:11:57.46392794 +0000 UTC m=+20.153859748" observedRunningTime="2026-04-16 13:11:58.065765537 +0000 UTC m=+20.755697351" watchObservedRunningTime="2026-04-16 13:11:58.06584778 +0000 UTC m=+20.755779595" Apr 16 13:11:58.100538 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.100495 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-rcb6c" podStartSLOduration=3.042379662 podStartE2EDuration="20.10048128s" podCreationTimestamp="2026-04-16 13:11:38 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.375679697 +0000 UTC m=+3.065611494" lastFinishedPulling="2026-04-16 13:11:57.433781305 +0000 UTC m=+20.123713112" observedRunningTime="2026-04-16 13:11:58.100297228 +0000 UTC m=+20.790229043" watchObservedRunningTime="2026-04-16 13:11:58.10048128 +0000 UTC m=+20.790413094" Apr 16 13:11:58.282671 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.282416 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:58.283184 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.283165 2574 kubelet.go:2658] "SyncLoop 
(probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:58.565689 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.565589 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:58.566292 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.566262 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-5nl2m" Apr 16 13:11:58.878292 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:58.878219 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:11:58.878427 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:58.878335 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:11:59.017642 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.017610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hqlgp" event={"ID":"f731cddc-60f5-4b83-954c-670a96ec5019","Type":"ContainerStarted","Data":"044e03f7ec7b6dd9a564e8d93175d89253df11c588941beb5bb8b1361f089fe5"} Apr 16 13:11:59.019966 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.019948 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:11:59.020281 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.020264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"9f009f94cee0f6608c378d7cd6f60149e739822fbfcce15f1f3903e2ec817bd4"} Apr 16 13:11:59.020338 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.020292 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"4fe837835ed4cc39017734f5c6acddb4a1fe20b95280033ed59c552381dab792"} Apr 16 13:11:59.020338 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.020304 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"4eedd38695db3f5f07fdb856bb587adeeb475e62c443eb9bdf8dc11d18b80fec"} Apr 16 13:11:59.020338 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.020312 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" 
event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"5c49d80fbbfebeab922ee4e660d53e520418683752227dc4712dd556e9abb0d2"} Apr 16 13:11:59.021530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.021508 2574 generic.go:358] "Generic (PLEG): container finished" podID="de1ece58-c2c4-43a2-9674-0ba69c468ab3" containerID="8a020a39d5fa77b0adaf79f7757ef8b7516e7f935c0c72bbd1ed9cbe692558d5" exitCode=0 Apr 16 13:11:59.021617 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.021596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerDied","Data":"8a020a39d5fa77b0adaf79f7757ef8b7516e7f935c0c72bbd1ed9cbe692558d5"} Apr 16 13:11:59.051436 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.051395 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hqlgp" podStartSLOduration=4.988936059 podStartE2EDuration="22.051379319s" podCreationTimestamp="2026-04-16 13:11:37 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.371241717 +0000 UTC m=+3.061173514" lastFinishedPulling="2026-04-16 13:11:57.433684978 +0000 UTC m=+20.123616774" observedRunningTime="2026-04-16 13:11:59.031737186 +0000 UTC m=+21.721669013" watchObservedRunningTime="2026-04-16 13:11:59.051379319 +0000 UTC m=+21.741311134" Apr 16 13:11:59.090364 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.090336 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:11:59.831191 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.831060 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:11:59.090355086Z","UUID":"eace7c94-e991-459b-a4d5-e17f50836423","Handler":null,"Name":"","Endpoint":""} Apr 16 13:11:59.833251 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.833229 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:11:59.833397 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.833261 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:11:59.878999 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:11:59.878968 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:11:59.879186 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:11:59.879124 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:12:00.025152 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:00.024919 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" event={"ID":"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865","Type":"ContainerStarted","Data":"75f7fdfc3937116bd400eb9e9a296e908783944b955e485dd5a12e33d04f5724"} Apr 16 13:12:00.878077 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:00.877997 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:12:00.878240 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:00.878195 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:12:01.030613 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:01.030584 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:12:01.031048 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:01.030966 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"4d9d6a4171cb23ee7f159bc47b021396c8468556a34bd6ed808ec97017e58814"} Apr 16 13:12:01.032865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:01.032831 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" event={"ID":"07f5dcd1-3ca5-4bf2-9734-b2a40b5d1865","Type":"ContainerStarted","Data":"d69637cbbf01e363d30385ece63cfa3d4006f58d9b3122276de00d901adb1715"} Apr 16 13:12:01.054481 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:01.054403 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g59px" podStartSLOduration=3.954351548 podStartE2EDuration="24.054385674s" podCreationTimestamp="2026-04-16 13:11:37 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.377990266 +0000 UTC m=+3.067922071" lastFinishedPulling="2026-04-16 13:12:00.47802439 +0000 UTC m=+23.167956197" 
observedRunningTime="2026-04-16 13:12:01.053748699 +0000 UTC m=+23.743680513" watchObservedRunningTime="2026-04-16 13:12:01.054385674 +0000 UTC m=+23.744317490" Apr 16 13:12:01.878671 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:01.878464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:12:01.878864 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:01.878753 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:12:02.878805 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:02.878780 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:12:02.879336 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:02.878876 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:12:03.039071 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:03.039052 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:12:03.039420 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:03.039395 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"2b6ff47159e8cdd3809c507e05a5ffdb8ddbbf93250f091b47a526dcefc4130b"} Apr 16 13:12:03.039775 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:03.039749 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:12:03.039775 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:03.039783 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:12:03.039993 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:03.039933 2574 scope.go:117] "RemoveContainer" containerID="78e0cd8d3cb424171a714a487f942af569c1822cd54105ce10371ba521af81bc" Apr 16 13:12:03.055539 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:03.055361 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:12:03.057341 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:03.057202 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:12:03.653718 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:03.653681 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" Apr 16 13:12:03.878070 ip-10-0-137-242 kubenswrapper[2574]: I0416 
13:12:03.878047 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:12:03.878185 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:03.878132 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:12:04.044851 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.044823 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:12:04.045422 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.045173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" event={"ID":"abe3d854-132c-4011-9672-f19aca1e1f71","Type":"ContainerStarted","Data":"589bdbf6074d47f456419ff3e4303e4607801e3ecfd92ca93630681fcd5bbc5e"} Apr 16 13:12:04.046678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.046652 2574 generic.go:358] "Generic (PLEG): container finished" podID="de1ece58-c2c4-43a2-9674-0ba69c468ab3" containerID="582489d5daa1d699a5b015a64576a15dfe0dd3ed37a052dd9a2d03ccccb4865c" exitCode=0 Apr 16 13:12:04.046757 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.046690 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerDied","Data":"582489d5daa1d699a5b015a64576a15dfe0dd3ed37a052dd9a2d03ccccb4865c"} Apr 16 13:12:04.076364 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.076318 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r" podStartSLOduration=8.931713947 podStartE2EDuration="26.076305735s" podCreationTimestamp="2026-04-16 13:11:38 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.377177873 +0000 UTC m=+3.067109679" lastFinishedPulling="2026-04-16 13:11:57.521769673 +0000 UTC m=+20.211701467" observedRunningTime="2026-04-16 13:12:04.074791573 +0000 UTC m=+26.764723387" watchObservedRunningTime="2026-04-16 13:12:04.076305735 +0000 UTC m=+26.766237546" Apr 16 13:12:04.878952 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.878915 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:12:04.879091 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:04.879059 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:12:04.891026 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.890975 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-shf7m"] Apr 16 13:12:04.891708 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.891679 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vcxvp"] Apr 16 13:12:04.891819 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:04.891806 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:12:04.891940 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:04.891917 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:12:05.050890 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:05.050855 2574 generic.go:358] "Generic (PLEG): container finished" podID="de1ece58-c2c4-43a2-9674-0ba69c468ab3" containerID="d80d78f90ca06ea118ea941f0b43cd95d5a6cdb95dc186d0c783554c06a0fe7c" exitCode=0 Apr 16 13:12:05.051383 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:05.050943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerDied","Data":"d80d78f90ca06ea118ea941f0b43cd95d5a6cdb95dc186d0c783554c06a0fe7c"} Apr 16 13:12:05.052167 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:05.051586 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:12:05.052167 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:05.051672 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:12:06.054952 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:06.054844 2574 generic.go:358] "Generic (PLEG): container finished" podID="de1ece58-c2c4-43a2-9674-0ba69c468ab3" containerID="f42cc35a8c56356a8b805d3fb1f34747d959fdc3ccd82f4401e9a7351408fd04" exitCode=0 Apr 16 13:12:06.054952 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:06.054893 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerDied","Data":"f42cc35a8c56356a8b805d3fb1f34747d959fdc3ccd82f4401e9a7351408fd04"} Apr 16 13:12:06.878144 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:06.878113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:12:06.878325 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:06.878121 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:12:06.878325 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:06.878230 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:12:06.878427 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:06.878331 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:12:08.878943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:08.878664 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp" Apr 16 13:12:08.879411 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:08.878682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:12:08.879411 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:08.878986 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vcxvp" podUID="249e62bb-dc30-4b99-98be-87bddf8915ee" Apr 16 13:12:08.879411 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:08.879091 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3" Apr 16 13:12:10.582610 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.582578 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-242.ec2.internal" event="NodeReady" Apr 16 13:12:10.583196 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.582727 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:12:10.622101 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.622068 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5ghzg"] Apr 16 13:12:10.626116 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.626082 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vgz9p"] Apr 16 13:12:10.626275 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.626257 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5ghzg" Apr 16 13:12:10.628933 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.628912 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 13:12:10.629066 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.628972 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkkx5\"" Apr 16 13:12:10.629066 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.629012 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vgz9p" Apr 16 13:12:10.629066 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.629024 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 13:12:10.631506 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.631381 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 13:12:10.631506 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.631408 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 13:12:10.631683 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.631512 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 13:12:10.631683 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.631573 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrfd8\"" Apr 16 13:12:10.633797 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.633757 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5ghzg"] Apr 16 13:12:10.636147 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.636123 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vgz9p"] Apr 16 13:12:10.705439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.705402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31b5bb3a-e40c-4101-9afe-159f4de5432f-tmp-dir\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg" Apr 16 13:12:10.705627 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.705479 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31b5bb3a-e40c-4101-9afe-159f4de5432f-config-volume\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg" Apr 16 13:12:10.705627 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.705503 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p" Apr 16 13:12:10.705627 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.705517 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg" Apr 16 13:12:10.705627 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.705615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qhbx\" (UniqueName: \"kubernetes.io/projected/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-kube-api-access-9qhbx\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p" Apr 16 13:12:10.705815 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.705665 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hbzt\" (UniqueName: \"kubernetes.io/projected/31b5bb3a-e40c-4101-9afe-159f4de5432f-kube-api-access-8hbzt\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg" Apr 16 
13:12:10.806989 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.806904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qhbx\" (UniqueName: \"kubernetes.io/projected/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-kube-api-access-9qhbx\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:10.806989 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.806945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8hbzt\" (UniqueName: \"kubernetes.io/projected/31b5bb3a-e40c-4101-9afe-159f4de5432f-kube-api-access-8hbzt\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:10.806989 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.806973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31b5bb3a-e40c-4101-9afe-159f4de5432f-tmp-dir\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:10.807275 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.807093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31b5bb3a-e40c-4101-9afe-159f4de5432f-config-volume\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:10.807275 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.807125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:10.807275 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.807150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:10.807275 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:10.807252 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:10.807487 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:10.807309 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls podName:31b5bb3a-e40c-4101-9afe-159f4de5432f nodeName:}" failed. No retries permitted until 2026-04-16 13:12:11.307288836 +0000 UTC m=+33.997220634 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls") pod "dns-default-5ghzg" (UID: "31b5bb3a-e40c-4101-9afe-159f4de5432f") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:10.807487 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:10.807391 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:10.807487 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:10.807433 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert podName:de85bd7a-e5fd-4893-9f0b-d98d2bc4c225 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:11.307418402 +0000 UTC m=+33.997350195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert") pod "ingress-canary-vgz9p" (UID: "de85bd7a-e5fd-4893-9f0b-d98d2bc4c225") : secret "canary-serving-cert" not found
Apr 16 13:12:10.807699 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.807672 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/31b5bb3a-e40c-4101-9afe-159f4de5432f-tmp-dir\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:10.807699 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.807678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/31b5bb3a-e40c-4101-9afe-159f4de5432f-config-volume\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:10.817895 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.817866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hbzt\" (UniqueName: \"kubernetes.io/projected/31b5bb3a-e40c-4101-9afe-159f4de5432f-kube-api-access-8hbzt\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:10.818053 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.817927 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qhbx\" (UniqueName: \"kubernetes.io/projected/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-kube-api-access-9qhbx\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:10.878048 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.878004 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m"
Apr 16 13:12:10.878216 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.878008 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp"
Apr 16 13:12:10.881001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.880977 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 13:12:10.881001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.880989 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hkf74\""
Apr 16 13:12:10.881192 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.880981 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 13:12:10.881192 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.880992 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhs75\""
Apr 16 13:12:10.881192 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:10.880996 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 13:12:11.312037 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:11.312004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:11.312037 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:11.312041 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:11.312252 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:11.312159 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:11.312252 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:11.312217 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls podName:31b5bb3a-e40c-4101-9afe-159f4de5432f nodeName:}" failed. No retries permitted until 2026-04-16 13:12:12.312202205 +0000 UTC m=+35.002133997 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls") pod "dns-default-5ghzg" (UID: "31b5bb3a-e40c-4101-9afe-159f4de5432f") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:11.312252 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:11.312159 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:11.312252 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:11.312254 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert podName:de85bd7a-e5fd-4893-9f0b-d98d2bc4c225 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:12.312245524 +0000 UTC m=+35.002177319 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert") pod "ingress-canary-vgz9p" (UID: "de85bd7a-e5fd-4893-9f0b-d98d2bc4c225") : secret "canary-serving-cert" not found
Apr 16 13:12:11.514165 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:11.514112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m"
Apr 16 13:12:11.514344 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:11.514283 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:12:11.514407 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:11.514360 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:43.51434274 +0000 UTC m=+66.204274533 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : secret "metrics-daemon-secret" not found
Apr 16 13:12:11.615037 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:11.614955 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp"
Apr 16 13:12:11.617501 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:11.617482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p98r\" (UniqueName: \"kubernetes.io/projected/249e62bb-dc30-4b99-98be-87bddf8915ee-kube-api-access-5p98r\") pod \"network-check-target-vcxvp\" (UID: \"249e62bb-dc30-4b99-98be-87bddf8915ee\") " pod="openshift-network-diagnostics/network-check-target-vcxvp"
Apr 16 13:12:11.796916 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:11.796882 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vcxvp"
Apr 16 13:12:11.972589 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:11.972287 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vcxvp"]
Apr 16 13:12:11.975242 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:12:11.975213 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod249e62bb_dc30_4b99_98be_87bddf8915ee.slice/crio-e6c8ad339097fde398396a18c1bcb6e0ba98347bbb1a06debf0c2a54fe882885 WatchSource:0}: Error finding container e6c8ad339097fde398396a18c1bcb6e0ba98347bbb1a06debf0c2a54fe882885: Status 404 returned error can't find the container with id e6c8ad339097fde398396a18c1bcb6e0ba98347bbb1a06debf0c2a54fe882885
Apr 16 13:12:12.067365 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:12.067332 2574 generic.go:358] "Generic (PLEG): container finished" podID="de1ece58-c2c4-43a2-9674-0ba69c468ab3" containerID="c444a5a3c2c8fcad6806e886bdf14128781490bc1c184f292bc6da6e3eaa2790" exitCode=0
Apr 16 13:12:12.067529 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:12.067410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerDied","Data":"c444a5a3c2c8fcad6806e886bdf14128781490bc1c184f292bc6da6e3eaa2790"}
Apr 16 13:12:12.068362 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:12.068346 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vcxvp" event={"ID":"249e62bb-dc30-4b99-98be-87bddf8915ee","Type":"ContainerStarted","Data":"e6c8ad339097fde398396a18c1bcb6e0ba98347bbb1a06debf0c2a54fe882885"}
Apr 16 13:12:12.320817 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:12.320778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:12.320817 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:12.320825 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:12.321066 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:12.320932 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:12.321066 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:12.321000 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert podName:de85bd7a-e5fd-4893-9f0b-d98d2bc4c225 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:14.32098348 +0000 UTC m=+37.010915273 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert") pod "ingress-canary-vgz9p" (UID: "de85bd7a-e5fd-4893-9f0b-d98d2bc4c225") : secret "canary-serving-cert" not found
Apr 16 13:12:12.321066 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:12.320936 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:12.321191 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:12.321084 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls podName:31b5bb3a-e40c-4101-9afe-159f4de5432f nodeName:}" failed. No retries permitted until 2026-04-16 13:12:14.321067477 +0000 UTC m=+37.010999276 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls") pod "dns-default-5ghzg" (UID: "31b5bb3a-e40c-4101-9afe-159f4de5432f") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:13.073596 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:13.073563 2574 generic.go:358] "Generic (PLEG): container finished" podID="de1ece58-c2c4-43a2-9674-0ba69c468ab3" containerID="b87ea38f604d2d56afb61dab959a9ecc4bf762051468c94c0aa88b26295b2c6d" exitCode=0
Apr 16 13:12:13.074040 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:13.073646 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerDied","Data":"b87ea38f604d2d56afb61dab959a9ecc4bf762051468c94c0aa88b26295b2c6d"}
Apr 16 13:12:14.079338 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:14.079262 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" event={"ID":"de1ece58-c2c4-43a2-9674-0ba69c468ab3","Type":"ContainerStarted","Data":"9a68f8a67e982b8c5409c524490fafaecb22112eab52fe91e24f5cddc6dac99a"}
Apr 16 13:12:14.102679 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:14.102630 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lcmf6" podStartSLOduration=4.733916576 podStartE2EDuration="36.10261526s" podCreationTimestamp="2026-04-16 13:11:38 +0000 UTC" firstStartedPulling="2026-04-16 13:11:40.369924989 +0000 UTC m=+3.059856785" lastFinishedPulling="2026-04-16 13:12:11.738623666 +0000 UTC m=+34.428555469" observedRunningTime="2026-04-16 13:12:14.102380827 +0000 UTC m=+36.792312641" watchObservedRunningTime="2026-04-16 13:12:14.10261526 +0000 UTC m=+36.792547074"
Apr 16 13:12:14.337015 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:14.336924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:14.337015 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:14.336976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:14.337250 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:14.337097 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:14.337250 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:14.337100 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:14.337250 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:14.337163 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls podName:31b5bb3a-e40c-4101-9afe-159f4de5432f nodeName:}" failed. No retries permitted until 2026-04-16 13:12:18.337148302 +0000 UTC m=+41.027080095 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls") pod "dns-default-5ghzg" (UID: "31b5bb3a-e40c-4101-9afe-159f4de5432f") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:14.337250 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:14.337177 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert podName:de85bd7a-e5fd-4893-9f0b-d98d2bc4c225 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:18.337170955 +0000 UTC m=+41.027102748 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert") pod "ingress-canary-vgz9p" (UID: "de85bd7a-e5fd-4893-9f0b-d98d2bc4c225") : secret "canary-serving-cert" not found
Apr 16 13:12:15.082250 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:15.081987 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vcxvp" event={"ID":"249e62bb-dc30-4b99-98be-87bddf8915ee","Type":"ContainerStarted","Data":"8ca2337156adb848ec0e6e47d5bf9a59071c974364634fe4969973dfe921a16f"}
Apr 16 13:12:15.082630 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:15.082375 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vcxvp"
Apr 16 13:12:15.098006 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:15.097961 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vcxvp" podStartSLOduration=35.189656436 podStartE2EDuration="38.097948882s" podCreationTimestamp="2026-04-16 13:11:37 +0000 UTC" firstStartedPulling="2026-04-16 13:12:11.977307796 +0000 UTC m=+34.667239589" lastFinishedPulling="2026-04-16 13:12:14.885600243 +0000 UTC m=+37.575532035" observedRunningTime="2026-04-16 13:12:15.09750586 +0000 UTC m=+37.787437676" watchObservedRunningTime="2026-04-16 13:12:15.097948882 +0000 UTC m=+37.787880697"
Apr 16 13:12:18.366158 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:18.366113 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:18.366158 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:18.366158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:18.366690 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:18.366267 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:18.366690 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:18.366269 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:18.366690 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:18.366339 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls podName:31b5bb3a-e40c-4101-9afe-159f4de5432f nodeName:}" failed. No retries permitted until 2026-04-16 13:12:26.366317752 +0000 UTC m=+49.056249545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls") pod "dns-default-5ghzg" (UID: "31b5bb3a-e40c-4101-9afe-159f4de5432f") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:18.366690 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:18.366361 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert podName:de85bd7a-e5fd-4893-9f0b-d98d2bc4c225 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:26.366351605 +0000 UTC m=+49.056283401 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert") pod "ingress-canary-vgz9p" (UID: "de85bd7a-e5fd-4893-9f0b-d98d2bc4c225") : secret "canary-serving-cert" not found
Apr 16 13:12:26.419724 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:26.419684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:26.419724 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:26.419727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:26.420216 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:26.419827 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:26.420216 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:26.419830 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:26.420216 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:26.419884 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert podName:de85bd7a-e5fd-4893-9f0b-d98d2bc4c225 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:42.419868383 +0000 UTC m=+65.109800176 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert") pod "ingress-canary-vgz9p" (UID: "de85bd7a-e5fd-4893-9f0b-d98d2bc4c225") : secret "canary-serving-cert" not found
Apr 16 13:12:26.420216 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:26.419900 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls podName:31b5bb3a-e40c-4101-9afe-159f4de5432f nodeName:}" failed. No retries permitted until 2026-04-16 13:12:42.41989397 +0000 UTC m=+65.109825762 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls") pod "dns-default-5ghzg" (UID: "31b5bb3a-e40c-4101-9afe-159f4de5432f") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:35.061187 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:35.061153 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xp25r"
Apr 16 13:12:42.435847 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:42.435806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:12:42.435847 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:42.435847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:12:42.436254 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:42.435943 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:42.436254 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:42.435961 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:42.436254 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:42.436003 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert podName:de85bd7a-e5fd-4893-9f0b-d98d2bc4c225 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:14.435986638 +0000 UTC m=+97.125918431 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert") pod "ingress-canary-vgz9p" (UID: "de85bd7a-e5fd-4893-9f0b-d98d2bc4c225") : secret "canary-serving-cert" not found
Apr 16 13:12:42.436254 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:42.436019 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls podName:31b5bb3a-e40c-4101-9afe-159f4de5432f nodeName:}" failed. No retries permitted until 2026-04-16 13:13:14.436011861 +0000 UTC m=+97.125943654 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls") pod "dns-default-5ghzg" (UID: "31b5bb3a-e40c-4101-9afe-159f4de5432f") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:43.542471 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:43.542404 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m"
Apr 16 13:12:43.542861 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:43.542553 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 13:12:43.542861 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:12:43.542615 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:47.542598914 +0000 UTC m=+130.232530707 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : secret "metrics-daemon-secret" not found
Apr 16 13:12:47.087770 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:12:47.087742 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vcxvp"
Apr 16 13:13:14.444550 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:14.444508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:13:14.444550 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:14.444554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:13:14.444900 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:14.444658 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:13:14.444900 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:14.444719 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls podName:31b5bb3a-e40c-4101-9afe-159f4de5432f nodeName:}" failed. No retries permitted until 2026-04-16 13:14:18.444704862 +0000 UTC m=+161.134636655 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls") pod "dns-default-5ghzg" (UID: "31b5bb3a-e40c-4101-9afe-159f4de5432f") : secret "dns-default-metrics-tls" not found
Apr 16 13:13:14.444900 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:14.444658 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:13:14.444900 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:14.444801 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert podName:de85bd7a-e5fd-4893-9f0b-d98d2bc4c225 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:18.444787784 +0000 UTC m=+161.134719591 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert") pod "ingress-canary-vgz9p" (UID: "de85bd7a-e5fd-4893-9f0b-d98d2bc4c225") : secret "canary-serving-cert" not found
Apr 16 13:13:46.517206 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.517167 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm"]
Apr 16 13:13:46.519713 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.519693 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm"
Apr 16 13:13:46.522477 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.522445 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:13:46.522717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.522699 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 13:13:46.523971 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.523955 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-xlfdj\""
Apr 16 13:13:46.527154 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.527132 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm"]
Apr 16 13:13:46.617940 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.617902 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p"]
Apr 16 13:13:46.620533 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.620506 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p"
Apr 16 13:13:46.621977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.621955 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57c969648d-q8lrq"]
Apr 16 13:13:46.623855 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.623817 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 13:13:46.624294 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.624277 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-b2hl4\""
Apr 16 13:13:46.624731 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.624712 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 13:13:46.625055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.625040 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 13:13:46.626181 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.626164 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:13:46.627666 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.627645 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-ppgg7"]
Apr 16 13:13:46.635702 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.635674 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57c969648d-q8lrq"
Apr 16 13:13:46.648773 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.648742 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 13:13:46.648985 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.648796 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sxsq4\""
Apr 16 13:13:46.649059 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.648843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 13:13:46.649175 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.648882 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 13:13:46.650285 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.650261 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p"]
Apr 16 13:13:46.650479 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.650444 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57c969648d-q8lrq"]
Apr 16 13:13:46.650713 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.650697 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7"
Apr 16 13:13:46.651053 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.651029 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-ppgg7"]
Apr 16 13:13:46.652610 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.652591 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 13:13:46.653511 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.653489 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-hpzz6\""
Apr 16 13:13:46.653704 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.653680 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 13:13:46.653790 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.653739 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 13:13:46.653790 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.653750 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:13:46.654580 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.654562 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 13:13:46.659092 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.659065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsqzj\" (UniqueName: \"kubernetes.io/projected/b5ad3b81-d3af-41e5-9cd8-ee5f402461da-kube-api-access-nsqzj\") pod \"volume-data-source-validator-7d955d5dd4-n6nwm\" (UID: 
\"b5ad3b81-d3af-41e5-9cd8-ee5f402461da\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm" Apr 16 13:13:46.660211 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.660194 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 13:13:46.719000 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.718966 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m"] Apr 16 13:13:46.721639 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.721615 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.724789 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.724768 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-4lthm\"" Apr 16 13:13:46.724933 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.724852 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:13:46.725112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.725099 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 13:13:46.725402 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.725387 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 13:13:46.726059 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.726041 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 13:13:46.737380 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.737352 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m"] Apr 16 13:13:46.759854 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.759823 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkxrg\" (UniqueName: \"kubernetes.io/projected/53d7350f-ab9a-4dba-a9fe-1a1980008f10-kube-api-access-vkxrg\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.759854 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.759857 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-image-registry-private-configuration\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.760068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.759881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8g9p\" (UniqueName: \"kubernetes.io/projected/93de10c9-feba-4c72-8314-241a4ff3342c-kube-api-access-c8g9p\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.760068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.759909 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-certificates\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.760068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.759929 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-bound-sa-token\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.760068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.759947 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d7350f-ab9a-4dba-a9fe-1a1980008f10-config\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.760068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.759964 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-trusted-ca\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.760068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsqzj\" (UniqueName: \"kubernetes.io/projected/b5ad3b81-d3af-41e5-9cd8-ee5f402461da-kube-api-access-nsqzj\") pod \"volume-data-source-validator-7d955d5dd4-n6nwm\" (UID: \"b5ad3b81-d3af-41e5-9cd8-ee5f402461da\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm" Apr 16 13:13:46.760068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.760329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760087 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-installation-pull-secrets\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.760329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgcfq\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-kube-api-access-lgcfq\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.760329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760160 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-ca-trust-extracted\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.760329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760182 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93de10c9-feba-4c72-8314-241a4ff3342c-config\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.760329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760198 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53d7350f-ab9a-4dba-a9fe-1a1980008f10-serving-cert\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.760329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760233 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93de10c9-feba-4c72-8314-241a4ff3342c-serving-cert\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.760329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.760270 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53d7350f-ab9a-4dba-a9fe-1a1980008f10-trusted-ca\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.773110 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.773048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsqzj\" (UniqueName: \"kubernetes.io/projected/b5ad3b81-d3af-41e5-9cd8-ee5f402461da-kube-api-access-nsqzj\") pod 
\"volume-data-source-validator-7d955d5dd4-n6nwm\" (UID: \"b5ad3b81-d3af-41e5-9cd8-ee5f402461da\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm" Apr 16 13:13:46.828985 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.828947 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm" Apr 16 13:13:46.860971 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.860923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkxrg\" (UniqueName: \"kubernetes.io/projected/53d7350f-ab9a-4dba-a9fe-1a1980008f10-kube-api-access-vkxrg\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.861159 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.860978 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-image-registry-private-configuration\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.861159 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8g9p\" (UniqueName: \"kubernetes.io/projected/93de10c9-feba-4c72-8314-241a4ff3342c-kube-api-access-c8g9p\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.861159 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861049 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-certificates\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.861159 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-bound-sa-token\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.861369 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d7350f-ab9a-4dba-a9fe-1a1980008f10-config\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.861369 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-trusted-ca\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.861369 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.861369 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:13:46.861342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-installation-pull-secrets\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.861592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861368 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lgcfq\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-kube-api-access-lgcfq\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.861592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-ca-trust-extracted\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.861592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.861592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/93de10c9-feba-4c72-8314-241a4ff3342c-config\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.861592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53d7350f-ab9a-4dba-a9fe-1a1980008f10-serving-cert\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.861592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93de10c9-feba-4c72-8314-241a4ff3342c-serving-cert\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.861592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.861931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53d7350f-ab9a-4dba-a9fe-1a1980008f10-trusted-ca\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.861931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtgv7\" (UniqueName: \"kubernetes.io/projected/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-kube-api-access-wtgv7\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.861931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-certificates\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.862078 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.861981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d7350f-ab9a-4dba-a9fe-1a1980008f10-config\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.862078 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:46.862007 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:13:46.862078 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:46.862021 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57c969648d-q8lrq: secret "image-registry-tls" not found Apr 16 13:13:46.862078 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:46.862070 
2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls podName:6b14168a-8ca2-4c5a-a9ba-0238e229d4b6 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:47.362051471 +0000 UTC m=+130.051983265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls") pod "image-registry-57c969648d-q8lrq" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6") : secret "image-registry-tls" not found Apr 16 13:13:46.862278 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.862185 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93de10c9-feba-4c72-8314-241a4ff3342c-config\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.862339 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.862282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-ca-trust-extracted\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.862413 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.862380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-trusted-ca\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.863131 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.863103 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53d7350f-ab9a-4dba-a9fe-1a1980008f10-trusted-ca\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.863809 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.863780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-image-registry-private-configuration\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.864560 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.864536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53d7350f-ab9a-4dba-a9fe-1a1980008f10-serving-cert\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.864708 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.864693 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-installation-pull-secrets\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.864775 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.864718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93de10c9-feba-4c72-8314-241a4ff3342c-serving-cert\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " 
pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.873115 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.873091 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-bound-sa-token\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.873684 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.873658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgcfq\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-kube-api-access-lgcfq\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:46.874269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.873810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8g9p\" (UniqueName: \"kubernetes.io/projected/93de10c9-feba-4c72-8314-241a4ff3342c-kube-api-access-c8g9p\") pod \"service-ca-operator-69965bb79d-qp97p\" (UID: \"93de10c9-feba-4c72-8314-241a4ff3342c\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.874269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.874183 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkxrg\" (UniqueName: \"kubernetes.io/projected/53d7350f-ab9a-4dba-a9fe-1a1980008f10-kube-api-access-vkxrg\") pod \"console-operator-d87b8d5fc-ppgg7\" (UID: \"53d7350f-ab9a-4dba-a9fe-1a1980008f10\") " pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.946092 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.946059 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm"] Apr 16 13:13:46.949328 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:13:46.949298 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5ad3b81_d3af_41e5_9cd8_ee5f402461da.slice/crio-db484cf07571ad2bb683c4050555057ea6c08c4a3c475221861a0accd3d9fc4e WatchSource:0}: Error finding container db484cf07571ad2bb683c4050555057ea6c08c4a3c475221861a0accd3d9fc4e: Status 404 returned error can't find the container with id db484cf07571ad2bb683c4050555057ea6c08c4a3c475221861a0accd3d9fc4e Apr 16 13:13:46.958548 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.958522 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" Apr 16 13:13:46.962501 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.962479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.962605 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.962512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wtgv7\" (UniqueName: \"kubernetes.io/projected/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-kube-api-access-wtgv7\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.962605 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.962591 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.963023 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.962998 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.964739 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.964719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:46.969526 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.969504 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:46.971358 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:46.971338 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtgv7\" (UniqueName: \"kubernetes.io/projected/6ea21a67-8607-4bf1-9fdc-dcdefcaf7121-kube-api-access-wtgv7\") pod \"kube-storage-version-migrator-operator-756bb7d76f-nth8m\" (UID: \"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:47.031592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.031497 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" Apr 16 13:13:47.080963 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.080913 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p"] Apr 16 13:13:47.087691 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:13:47.087638 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93de10c9_feba_4c72_8314_241a4ff3342c.slice/crio-c704895ee3a690ba3b1ae5be5d50bde4e1e3b32376c2c31282c27c8316b91b5b WatchSource:0}: Error finding container c704895ee3a690ba3b1ae5be5d50bde4e1e3b32376c2c31282c27c8316b91b5b: Status 404 returned error can't find the container with id c704895ee3a690ba3b1ae5be5d50bde4e1e3b32376c2c31282c27c8316b91b5b Apr 16 13:13:47.100040 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.100011 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-ppgg7"] Apr 16 13:13:47.102944 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:13:47.102909 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53d7350f_ab9a_4dba_a9fe_1a1980008f10.slice/crio-008bfcb1ca9444391928dd8d878bc4ab677b5457b9e8d30899b122acaf2bc8b0 WatchSource:0}: Error finding container 008bfcb1ca9444391928dd8d878bc4ab677b5457b9e8d30899b122acaf2bc8b0: Status 404 returned error can't find the container with id 008bfcb1ca9444391928dd8d878bc4ab677b5457b9e8d30899b122acaf2bc8b0 Apr 16 13:13:47.167286 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.167219 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m"] Apr 16 13:13:47.169798 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:13:47.169770 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ea21a67_8607_4bf1_9fdc_dcdefcaf7121.slice/crio-11712457be06d02f446527c3b4f009882c9a36d5f095cd55d48e1f85ba519262 WatchSource:0}: Error finding container 11712457be06d02f446527c3b4f009882c9a36d5f095cd55d48e1f85ba519262: Status 404 returned error can't find the container with id 11712457be06d02f446527c3b4f009882c9a36d5f095cd55d48e1f85ba519262 Apr 16 13:13:47.241788 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.241752 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" event={"ID":"93de10c9-feba-4c72-8314-241a4ff3342c","Type":"ContainerStarted","Data":"c704895ee3a690ba3b1ae5be5d50bde4e1e3b32376c2c31282c27c8316b91b5b"} Apr 16 13:13:47.242662 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.242635 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" event={"ID":"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121","Type":"ContainerStarted","Data":"11712457be06d02f446527c3b4f009882c9a36d5f095cd55d48e1f85ba519262"} Apr 16 
13:13:47.243567 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.243550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" event={"ID":"53d7350f-ab9a-4dba-a9fe-1a1980008f10","Type":"ContainerStarted","Data":"008bfcb1ca9444391928dd8d878bc4ab677b5457b9e8d30899b122acaf2bc8b0"} Apr 16 13:13:47.244444 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.244425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm" event={"ID":"b5ad3b81-d3af-41e5-9cd8-ee5f402461da","Type":"ContainerStarted","Data":"db484cf07571ad2bb683c4050555057ea6c08c4a3c475221861a0accd3d9fc4e"} Apr 16 13:13:47.366246 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.366155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:47.366390 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:47.366267 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:13:47.366390 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:47.366280 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57c969648d-q8lrq: secret "image-registry-tls" not found Apr 16 13:13:47.366390 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:47.366335 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls podName:6b14168a-8ca2-4c5a-a9ba-0238e229d4b6 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:13:48.366315956 +0000 UTC m=+131.056247752 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls") pod "image-registry-57c969648d-q8lrq" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6") : secret "image-registry-tls" not found Apr 16 13:13:47.567737 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:47.567691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:13:47.568141 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:47.567847 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 13:13:47.568141 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:47.567914 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs podName:aca4cb2c-c97a-48d4-a12d-3316ad904fc3 nodeName:}" failed. No retries permitted until 2026-04-16 13:15:49.567899079 +0000 UTC m=+252.257830877 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs") pod "network-metrics-daemon-shf7m" (UID: "aca4cb2c-c97a-48d4-a12d-3316ad904fc3") : secret "metrics-daemon-secret" not found Apr 16 13:13:48.373399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:48.373357 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:48.373603 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:48.373547 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:13:48.373603 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:48.373571 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57c969648d-q8lrq: secret "image-registry-tls" not found Apr 16 13:13:48.373772 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:48.373658 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls podName:6b14168a-8ca2-4c5a-a9ba-0238e229d4b6 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:50.373635436 +0000 UTC m=+133.063567228 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls") pod "image-registry-57c969648d-q8lrq" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6") : secret "image-registry-tls" not found Apr 16 13:13:50.389921 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:50.389894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:50.390193 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:50.390037 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:13:50.390193 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:50.390054 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57c969648d-q8lrq: secret "image-registry-tls" not found Apr 16 13:13:50.390193 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:50.390105 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls podName:6b14168a-8ca2-4c5a-a9ba-0238e229d4b6 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:54.390091007 +0000 UTC m=+137.080022800 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls") pod "image-registry-57c969648d-q8lrq" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6") : secret "image-registry-tls" not found Apr 16 13:13:51.258524 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.258485 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm" event={"ID":"b5ad3b81-d3af-41e5-9cd8-ee5f402461da","Type":"ContainerStarted","Data":"8c017896712e51bf90008fd3be0ad4f30a269aa3e9c151aea03d06362f97f9cd"} Apr 16 13:13:51.259788 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.259757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" event={"ID":"93de10c9-feba-4c72-8314-241a4ff3342c","Type":"ContainerStarted","Data":"c7c5f4b85252dd84a3bb28e98d177805d95179f1ab54661c9fa71d361b5f54ba"} Apr 16 13:13:51.261051 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.261025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" event={"ID":"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121","Type":"ContainerStarted","Data":"25e6834e934182c618df41db98859dc12f88cc13a5ad41eadbd4613c338b790b"} Apr 16 13:13:51.262398 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.262379 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/0.log" Apr 16 13:13:51.262501 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.262417 2574 generic.go:358] "Generic (PLEG): container finished" podID="53d7350f-ab9a-4dba-a9fe-1a1980008f10" containerID="cbea3f37f9efff003d908d9cded1fab328be79733bba00248881444d0529101e" exitCode=255 Apr 16 13:13:51.262501 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:13:51.262466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" event={"ID":"53d7350f-ab9a-4dba-a9fe-1a1980008f10","Type":"ContainerDied","Data":"cbea3f37f9efff003d908d9cded1fab328be79733bba00248881444d0529101e"} Apr 16 13:13:51.262703 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.262686 2574 scope.go:117] "RemoveContainer" containerID="cbea3f37f9efff003d908d9cded1fab328be79733bba00248881444d0529101e" Apr 16 13:13:51.277338 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.277284 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-n6nwm" podStartSLOduration=1.880051618 podStartE2EDuration="5.277268646s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="2026-04-16 13:13:46.951080312 +0000 UTC m=+129.641012108" lastFinishedPulling="2026-04-16 13:13:50.348297339 +0000 UTC m=+133.038229136" observedRunningTime="2026-04-16 13:13:51.275817016 +0000 UTC m=+133.965748830" watchObservedRunningTime="2026-04-16 13:13:51.277268646 +0000 UTC m=+133.967200462" Apr 16 13:13:51.292801 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.292749 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" podStartSLOduration=2.028232476 podStartE2EDuration="5.292730335s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="2026-04-16 13:13:47.090217333 +0000 UTC m=+129.780149127" lastFinishedPulling="2026-04-16 13:13:50.354715193 +0000 UTC m=+133.044646986" observedRunningTime="2026-04-16 13:13:51.292728784 +0000 UTC m=+133.982660600" watchObservedRunningTime="2026-04-16 13:13:51.292730335 +0000 UTC m=+133.982662150" Apr 16 13:13:51.377149 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.377085 2574 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" podStartSLOduration=2.195997249 podStartE2EDuration="5.37706428s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="2026-04-16 13:13:47.171655598 +0000 UTC m=+129.861587392" lastFinishedPulling="2026-04-16 13:13:50.352722628 +0000 UTC m=+133.042654423" observedRunningTime="2026-04-16 13:13:51.345215114 +0000 UTC m=+134.035146930" watchObservedRunningTime="2026-04-16 13:13:51.37706428 +0000 UTC m=+134.066996097" Apr 16 13:13:51.377288 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.377232 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4"] Apr 16 13:13:51.380632 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.380609 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" Apr 16 13:13:51.383806 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.383778 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-l9hhp\"" Apr 16 13:13:51.383931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.383819 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 13:13:51.383931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.383784 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 13:13:51.389948 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.389914 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4"] Apr 16 13:13:51.498339 ip-10-0-137-242 kubenswrapper[2574]: 
I0416 13:13:51.498299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmppb\" (UniqueName: \"kubernetes.io/projected/87dd2329-65a9-4de8-991f-e8d66652bcff-kube-api-access-qmppb\") pod \"migrator-64d4d94569-8hgp4\" (UID: \"87dd2329-65a9-4de8-991f-e8d66652bcff\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" Apr 16 13:13:51.599685 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.599615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmppb\" (UniqueName: \"kubernetes.io/projected/87dd2329-65a9-4de8-991f-e8d66652bcff-kube-api-access-qmppb\") pod \"migrator-64d4d94569-8hgp4\" (UID: \"87dd2329-65a9-4de8-991f-e8d66652bcff\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" Apr 16 13:13:51.608642 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.608613 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmppb\" (UniqueName: \"kubernetes.io/projected/87dd2329-65a9-4de8-991f-e8d66652bcff-kube-api-access-qmppb\") pod \"migrator-64d4d94569-8hgp4\" (UID: \"87dd2329-65a9-4de8-991f-e8d66652bcff\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" Apr 16 13:13:51.692471 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.692421 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" Apr 16 13:13:51.817859 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:51.817830 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4"] Apr 16 13:13:51.820095 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:13:51.820068 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87dd2329_65a9_4de8_991f_e8d66652bcff.slice/crio-487ff80c069d98d2774b69acc6cffb1cd1afc81bd9dcba80f5d448b92c7b3e43 WatchSource:0}: Error finding container 487ff80c069d98d2774b69acc6cffb1cd1afc81bd9dcba80f5d448b92c7b3e43: Status 404 returned error can't find the container with id 487ff80c069d98d2774b69acc6cffb1cd1afc81bd9dcba80f5d448b92c7b3e43 Apr 16 13:13:52.267026 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:52.267000 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/1.log" Apr 16 13:13:52.267388 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:52.267370 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/0.log" Apr 16 13:13:52.267475 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:52.267410 2574 generic.go:358] "Generic (PLEG): container finished" podID="53d7350f-ab9a-4dba-a9fe-1a1980008f10" containerID="19e1f3dfb8e7813e65643afd25112d0dddaf436c1bf77e766e62040458f6eda0" exitCode=255 Apr 16 13:13:52.267532 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:52.267503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" 
event={"ID":"53d7350f-ab9a-4dba-a9fe-1a1980008f10","Type":"ContainerDied","Data":"19e1f3dfb8e7813e65643afd25112d0dddaf436c1bf77e766e62040458f6eda0"} Apr 16 13:13:52.267587 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:52.267544 2574 scope.go:117] "RemoveContainer" containerID="cbea3f37f9efff003d908d9cded1fab328be79733bba00248881444d0529101e" Apr 16 13:13:52.267754 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:52.267738 2574 scope.go:117] "RemoveContainer" containerID="19e1f3dfb8e7813e65643afd25112d0dddaf436c1bf77e766e62040458f6eda0" Apr 16 13:13:52.267989 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:52.267970 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-ppgg7_openshift-console-operator(53d7350f-ab9a-4dba-a9fe-1a1980008f10)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" podUID="53d7350f-ab9a-4dba-a9fe-1a1980008f10" Apr 16 13:13:52.268574 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:52.268539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" event={"ID":"87dd2329-65a9-4de8-991f-e8d66652bcff","Type":"ContainerStarted","Data":"487ff80c069d98d2774b69acc6cffb1cd1afc81bd9dcba80f5d448b92c7b3e43"} Apr 16 13:13:52.628722 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:52.628642 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rcb6c_2f6415fd-c655-4953-99ef-9575b096b3e2/dns-node-resolver/0.log" Apr 16 13:13:53.273069 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:53.272990 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/1.log" Apr 16 13:13:53.273423 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:53.273398 2574 
scope.go:117] "RemoveContainer" containerID="19e1f3dfb8e7813e65643afd25112d0dddaf436c1bf77e766e62040458f6eda0" Apr 16 13:13:53.273657 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:53.273635 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-ppgg7_openshift-console-operator(53d7350f-ab9a-4dba-a9fe-1a1980008f10)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" podUID="53d7350f-ab9a-4dba-a9fe-1a1980008f10" Apr 16 13:13:53.274671 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:53.274653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" event={"ID":"87dd2329-65a9-4de8-991f-e8d66652bcff","Type":"ContainerStarted","Data":"cd22085d2c61a0b64b7f000aebf760118a99ccf3a74c47f803267f603f1ed5f8"} Apr 16 13:13:53.274754 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:53.274676 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" event={"ID":"87dd2329-65a9-4de8-991f-e8d66652bcff","Type":"ContainerStarted","Data":"4f3f3511b82c3ced98daddfc4b78b8faa9c25ed7e0ca8722cb650017e927a21e"} Apr 16 13:13:53.306539 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:53.306488 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-8hgp4" podStartSLOduration=1.132658659 podStartE2EDuration="2.306475656s" podCreationTimestamp="2026-04-16 13:13:51 +0000 UTC" firstStartedPulling="2026-04-16 13:13:51.821964868 +0000 UTC m=+134.511896661" lastFinishedPulling="2026-04-16 13:13:52.995781849 +0000 UTC m=+135.685713658" observedRunningTime="2026-04-16 13:13:53.306073772 +0000 UTC m=+135.996005599" watchObservedRunningTime="2026-04-16 13:13:53.306475656 +0000 UTC m=+135.996407477" 
Apr 16 13:13:53.440343 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:53.440314 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j7cl9_89d8b508-8fec-42a0-b382-7b625093ae71/node-ca/0.log" Apr 16 13:13:54.422747 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:54.422704 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:13:54.423139 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:54.422827 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:13:54.423139 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:54.422840 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-57c969648d-q8lrq: secret "image-registry-tls" not found Apr 16 13:13:54.423139 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:54.422896 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls podName:6b14168a-8ca2-4c5a-a9ba-0238e229d4b6 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:02.422878103 +0000 UTC m=+145.112809896 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls") pod "image-registry-57c969648d-q8lrq" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6") : secret "image-registry-tls" not found Apr 16 13:13:56.969753 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:56.969720 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:56.969753 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:56.969752 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" Apr 16 13:13:56.970176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:13:56.970091 2574 scope.go:117] "RemoveContainer" containerID="19e1f3dfb8e7813e65643afd25112d0dddaf436c1bf77e766e62040458f6eda0" Apr 16 13:13:56.970268 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:13:56.970249 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-ppgg7_openshift-console-operator(53d7350f-ab9a-4dba-a9fe-1a1980008f10)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" podUID="53d7350f-ab9a-4dba-a9fe-1a1980008f10" Apr 16 13:14:02.483330 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:02.483294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:14:02.485853 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:02.485826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"image-registry-57c969648d-q8lrq\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:14:02.565664 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:02.565629 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:14:02.691749 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:02.691688 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57c969648d-q8lrq"] Apr 16 13:14:02.694250 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:02.694223 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b14168a_8ca2_4c5a_a9ba_0238e229d4b6.slice/crio-a5c3a372457e2d58d14291d1b3a48a44d6b9b7112f5c05a1974e53632560b6ad WatchSource:0}: Error finding container a5c3a372457e2d58d14291d1b3a48a44d6b9b7112f5c05a1974e53632560b6ad: Status 404 returned error can't find the container with id a5c3a372457e2d58d14291d1b3a48a44d6b9b7112f5c05a1974e53632560b6ad Apr 16 13:14:03.303359 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:03.303321 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" event={"ID":"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6","Type":"ContainerStarted","Data":"c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234"} Apr 16 13:14:03.303577 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:03.303368 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" event={"ID":"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6","Type":"ContainerStarted","Data":"a5c3a372457e2d58d14291d1b3a48a44d6b9b7112f5c05a1974e53632560b6ad"} Apr 16 13:14:03.303577 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:14:03.303446 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-57c969648d-q8lrq"
Apr 16 13:14:03.323706 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:03.323665 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" podStartSLOduration=17.323650793 podStartE2EDuration="17.323650793s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:14:03.323130163 +0000 UTC m=+146.013061979" watchObservedRunningTime="2026-04-16 13:14:03.323650793 +0000 UTC m=+146.013582609"
Apr 16 13:14:10.878720 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:10.878685 2574 scope.go:117] "RemoveContainer" containerID="19e1f3dfb8e7813e65643afd25112d0dddaf436c1bf77e766e62040458f6eda0"
Apr 16 13:14:11.325782 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:11.325736 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log"
Apr 16 13:14:11.326140 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:11.326124 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/1.log"
Apr 16 13:14:11.326198 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:11.326162 2574 generic.go:358] "Generic (PLEG): container finished" podID="53d7350f-ab9a-4dba-a9fe-1a1980008f10" containerID="0b64a99a12f67bad00625c7c9dbb736961e62c4b0abc04607ebd98ef1f1d11ae" exitCode=255
Apr 16 13:14:11.326230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:11.326211 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" event={"ID":"53d7350f-ab9a-4dba-a9fe-1a1980008f10","Type":"ContainerDied","Data":"0b64a99a12f67bad00625c7c9dbb736961e62c4b0abc04607ebd98ef1f1d11ae"}
Apr 16 13:14:11.326273 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:11.326248 2574 scope.go:117] "RemoveContainer" containerID="19e1f3dfb8e7813e65643afd25112d0dddaf436c1bf77e766e62040458f6eda0"
Apr 16 13:14:11.326621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:11.326600 2574 scope.go:117] "RemoveContainer" containerID="0b64a99a12f67bad00625c7c9dbb736961e62c4b0abc04607ebd98ef1f1d11ae"
Apr 16 13:14:11.326831 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:11.326812 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-ppgg7_openshift-console-operator(53d7350f-ab9a-4dba-a9fe-1a1980008f10)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" podUID="53d7350f-ab9a-4dba-a9fe-1a1980008f10"
Apr 16 13:14:12.329438 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:12.329407 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log"
Apr 16 13:14:13.640991 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:13.640939 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-5ghzg" podUID="31b5bb3a-e40c-4101-9afe-159f4de5432f"
Apr 16 13:14:13.646534 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:13.646499 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vgz9p" podUID="de85bd7a-e5fd-4893-9f0b-d98d2bc4c225"
Apr 16 13:14:13.890574 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:13.890534 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-shf7m" podUID="aca4cb2c-c97a-48d4-a12d-3316ad904fc3"
Apr 16 13:14:13.936312 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.936278 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"]
Apr 16 13:14:13.939005 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.938979 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"
Apr 16 13:14:13.942421 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.942400 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 13:14:13.944025 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.944004 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-9ps54\""
Apr 16 13:14:13.944144 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.944031 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 13:14:13.950083 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.950047 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"]
Apr 16 13:14:13.959601 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.959572 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57c969648d-q8lrq"]
Apr 16 13:14:13.962990 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.962967 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4s7w4"]
Apr 16 13:14:13.965090 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.965067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:13.969735 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.969665 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 13:14:13.969735 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.969688 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 13:14:13.969735 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.969701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 13:14:13.969955 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.969698 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 13:14:13.970039 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.970022 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-mhnxk\""
Apr 16 13:14:13.976758 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:13.976736 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4s7w4"]
Apr 16 13:14:14.076138 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.076107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-data-volume\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.076336 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.076170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.076336 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.076251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzf5m\" (UniqueName: \"kubernetes.io/projected/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-kube-api-access-fzf5m\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.076336 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.076309 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.076585 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.076336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-crio-socket\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.076585 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.076402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/19d2435c-5afe-472e-90fc-b47d7e20c7ca-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-szf7m\" (UID: \"19d2435c-5afe-472e-90fc-b47d7e20c7ca\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"
Apr 16 13:14:14.076585 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.076473 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19d2435c-5afe-472e-90fc-b47d7e20c7ca-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-szf7m\" (UID: \"19d2435c-5afe-472e-90fc-b47d7e20c7ca\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"
Apr 16 13:14:14.177127 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-data-volume\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.177320 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.177320 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzf5m\" (UniqueName: \"kubernetes.io/projected/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-kube-api-access-fzf5m\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.177320 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.177320 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-crio-socket\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.177320 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/19d2435c-5afe-472e-90fc-b47d7e20c7ca-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-szf7m\" (UID: \"19d2435c-5afe-472e-90fc-b47d7e20c7ca\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"
Apr 16 13:14:14.177320 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19d2435c-5afe-472e-90fc-b47d7e20c7ca-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-szf7m\" (UID: \"19d2435c-5afe-472e-90fc-b47d7e20c7ca\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"
Apr 16 13:14:14.177655 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177331 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-crio-socket\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.177655 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177484 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-data-volume\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.177798 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.177914 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.177895 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/19d2435c-5afe-472e-90fc-b47d7e20c7ca-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-szf7m\" (UID: \"19d2435c-5afe-472e-90fc-b47d7e20c7ca\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"
Apr 16 13:14:14.179628 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.179608 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.179777 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.179760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/19d2435c-5afe-472e-90fc-b47d7e20c7ca-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-szf7m\" (UID: \"19d2435c-5afe-472e-90fc-b47d7e20c7ca\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"
Apr 16 13:14:14.187994 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.187939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzf5m\" (UniqueName: \"kubernetes.io/projected/a2e1eeca-0df2-4cf5-b682-837d6e38c6fe-kube-api-access-fzf5m\") pod \"insights-runtime-extractor-4s7w4\" (UID: \"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe\") " pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.247769 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.247733 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"
Apr 16 13:14:14.275245 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.275217 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4s7w4"
Apr 16 13:14:14.335334 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.334967 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:14:14.335334 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.334990 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:14:14.377380 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.377350 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m"]
Apr 16 13:14:14.380093 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:14.380066 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d2435c_5afe_472e_90fc_b47d7e20c7ca.slice/crio-f968fd5eae18a9505b9b3bfa2aaadcbb415e7be1cf4a1ae38301d1c51416280d WatchSource:0}: Error finding container f968fd5eae18a9505b9b3bfa2aaadcbb415e7be1cf4a1ae38301d1c51416280d: Status 404 returned error can't find the container with id f968fd5eae18a9505b9b3bfa2aaadcbb415e7be1cf4a1ae38301d1c51416280d
Apr 16 13:14:14.403987 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:14.403958 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4s7w4"]
Apr 16 13:14:14.406578 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:14.406552 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2e1eeca_0df2_4cf5_b682_837d6e38c6fe.slice/crio-aea1cad2e38959acb9e74984be474af145b18d0ce63fb1986b377a01ade02ca8 WatchSource:0}: Error finding container aea1cad2e38959acb9e74984be474af145b18d0ce63fb1986b377a01ade02ca8: Status 404 returned error can't find the container with id aea1cad2e38959acb9e74984be474af145b18d0ce63fb1986b377a01ade02ca8
Apr 16 13:14:15.338027 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:15.337942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m" event={"ID":"19d2435c-5afe-472e-90fc-b47d7e20c7ca","Type":"ContainerStarted","Data":"f968fd5eae18a9505b9b3bfa2aaadcbb415e7be1cf4a1ae38301d1c51416280d"}
Apr 16 13:14:15.339431 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:15.339406 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4s7w4" event={"ID":"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe","Type":"ContainerStarted","Data":"f9c3ba2e449b3dcd4ffb08b0829418769971f483dfd4764bcf6c6872c31a1ab5"}
Apr 16 13:14:15.339568 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:15.339438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4s7w4" event={"ID":"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe","Type":"ContainerStarted","Data":"28025c98176e361c1fab7e01e2b4b00218287b63b018fa605263eba830fa299f"}
Apr 16 13:14:15.339568 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:15.339473 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4s7w4" event={"ID":"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe","Type":"ContainerStarted","Data":"aea1cad2e38959acb9e74984be474af145b18d0ce63fb1986b377a01ade02ca8"}
Apr 16 13:14:16.343434 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:16.343395 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m" event={"ID":"19d2435c-5afe-472e-90fc-b47d7e20c7ca","Type":"ContainerStarted","Data":"7621e60a2ff88f41f41813e52bb73ad61b718e9a23ee2d50f3455024e277f89c"}
Apr 16 13:14:16.358534 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:16.358471 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-szf7m" podStartSLOduration=2.014335159 podStartE2EDuration="3.358437029s" podCreationTimestamp="2026-04-16 13:14:13 +0000 UTC" firstStartedPulling="2026-04-16 13:14:14.382099188 +0000 UTC m=+157.072030982" lastFinishedPulling="2026-04-16 13:14:15.726201055 +0000 UTC m=+158.416132852" observedRunningTime="2026-04-16 13:14:16.357983711 +0000 UTC m=+159.047915527" watchObservedRunningTime="2026-04-16 13:14:16.358437029 +0000 UTC m=+159.048368846"
Apr 16 13:14:16.969910 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:16.969857 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7"
Apr 16 13:14:16.969910 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:16.969917 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7"
Apr 16 13:14:16.970256 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:16.970243 2574 scope.go:117] "RemoveContainer" containerID="0b64a99a12f67bad00625c7c9dbb736961e62c4b0abc04607ebd98ef1f1d11ae"
Apr 16 13:14:16.970431 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:16.970413 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-ppgg7_openshift-console-operator(53d7350f-ab9a-4dba-a9fe-1a1980008f10)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" podUID="53d7350f-ab9a-4dba-a9fe-1a1980008f10"
Apr 16 13:14:17.347902 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:17.347811 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4s7w4" event={"ID":"a2e1eeca-0df2-4cf5-b682-837d6e38c6fe","Type":"ContainerStarted","Data":"3e61718d72ccff2339d829d29669ffc428f9fd597d6e248ed5cbde688f22089f"}
Apr 16 13:14:17.365467 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:17.365395 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4s7w4" podStartSLOduration=2.260848222 podStartE2EDuration="4.365380613s" podCreationTimestamp="2026-04-16 13:14:13 +0000 UTC" firstStartedPulling="2026-04-16 13:14:14.458654529 +0000 UTC m=+157.148586324" lastFinishedPulling="2026-04-16 13:14:16.563186918 +0000 UTC m=+159.253118715" observedRunningTime="2026-04-16 13:14:17.364667209 +0000 UTC m=+160.054599025" watchObservedRunningTime="2026-04-16 13:14:17.365380613 +0000 UTC m=+160.055312428"
Apr 16 13:14:18.508652 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.508542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:14:18.508652 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.508634 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:14:18.510865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.510841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/31b5bb3a-e40c-4101-9afe-159f4de5432f-metrics-tls\") pod \"dns-default-5ghzg\" (UID: \"31b5bb3a-e40c-4101-9afe-159f4de5432f\") " pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:14:18.510963 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.510870 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de85bd7a-e5fd-4893-9f0b-d98d2bc4c225-cert\") pod \"ingress-canary-vgz9p\" (UID: \"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225\") " pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:14:18.538646 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.538612 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrfd8\""
Apr 16 13:14:18.539962 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.539943 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkkx5\""
Apr 16 13:14:18.546161 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.546140 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:14:18.546273 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.546209 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vgz9p"
Apr 16 13:14:18.689948 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.689920 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vgz9p"]
Apr 16 13:14:18.693005 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:18.692973 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde85bd7a_e5fd_4893_9f0b_d98d2bc4c225.slice/crio-51c3c3671bf3868b33b77c78677e66a5395d5c53659a6b233d4ce5056de46c02 WatchSource:0}: Error finding container 51c3c3671bf3868b33b77c78677e66a5395d5c53659a6b233d4ce5056de46c02: Status 404 returned error can't find the container with id 51c3c3671bf3868b33b77c78677e66a5395d5c53659a6b233d4ce5056de46c02
Apr 16 13:14:18.705152 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:18.705129 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5ghzg"]
Apr 16 13:14:18.707053 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:18.707032 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b5bb3a_e40c_4101_9afe_159f4de5432f.slice/crio-1bc2a368fb8b988bc8de6341277e6d2400fd96f32364dc72e607532bb2987fe8 WatchSource:0}: Error finding container 1bc2a368fb8b988bc8de6341277e6d2400fd96f32364dc72e607532bb2987fe8: Status 404 returned error can't find the container with id 1bc2a368fb8b988bc8de6341277e6d2400fd96f32364dc72e607532bb2987fe8
Apr 16 13:14:19.354942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:19.354902 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5ghzg" event={"ID":"31b5bb3a-e40c-4101-9afe-159f4de5432f","Type":"ContainerStarted","Data":"1bc2a368fb8b988bc8de6341277e6d2400fd96f32364dc72e607532bb2987fe8"}
Apr 16 13:14:19.356009 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:19.355980 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vgz9p" event={"ID":"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225","Type":"ContainerStarted","Data":"51c3c3671bf3868b33b77c78677e66a5395d5c53659a6b233d4ce5056de46c02"}
Apr 16 13:14:20.360946 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:20.360900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5ghzg" event={"ID":"31b5bb3a-e40c-4101-9afe-159f4de5432f","Type":"ContainerStarted","Data":"ffe20d75f4662f4f79a049da75e8ca7cdab15a75a72e92800319007975554dd1"}
Apr 16 13:14:20.360946 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:20.360939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5ghzg" event={"ID":"31b5bb3a-e40c-4101-9afe-159f4de5432f","Type":"ContainerStarted","Data":"b2abfb9277f4d83233e736f827d86ed87a263f1f17212bd234f6b6f5b13cba1f"}
Apr 16 13:14:20.361409 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:20.361042 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:14:20.379267 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:20.379205 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5ghzg" podStartSLOduration=129.214660861 podStartE2EDuration="2m10.379185542s" podCreationTimestamp="2026-04-16 13:12:10 +0000 UTC" firstStartedPulling="2026-04-16 13:14:18.708820116 +0000 UTC m=+161.398751909" lastFinishedPulling="2026-04-16 13:14:19.873344798 +0000 UTC m=+162.563276590" observedRunningTime="2026-04-16 13:14:20.378608742 +0000 UTC m=+163.068540559" watchObservedRunningTime="2026-04-16 13:14:20.379185542 +0000 UTC m=+163.069117352"
Apr 16 13:14:21.075573 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.075536 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"]
Apr 16 13:14:21.077865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.077848 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"
Apr 16 13:14:21.080566 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.080538 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 16 13:14:21.080770 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.080758 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-v9cm6\""
Apr 16 13:14:21.085588 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.085550 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"]
Apr 16 13:14:21.126430 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.126406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8be2c403-d308-4c1d-83f1-374a4c40788d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-9mprx\" (UID: \"8be2c403-d308-4c1d-83f1-374a4c40788d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"
Apr 16 13:14:21.227544 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.227492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8be2c403-d308-4c1d-83f1-374a4c40788d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-9mprx\" (UID: \"8be2c403-d308-4c1d-83f1-374a4c40788d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"
Apr 16 13:14:21.227832 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:21.227619 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 13:14:21.227832 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:21.227687 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8be2c403-d308-4c1d-83f1-374a4c40788d-tls-certificates podName:8be2c403-d308-4c1d-83f1-374a4c40788d nodeName:}" failed. No retries permitted until 2026-04-16 13:14:21.727670756 +0000 UTC m=+164.417602549 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/8be2c403-d308-4c1d-83f1-374a4c40788d-tls-certificates") pod "prometheus-operator-admission-webhook-9cb97cd87-9mprx" (UID: "8be2c403-d308-4c1d-83f1-374a4c40788d") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 16 13:14:21.365050 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.364957 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vgz9p" event={"ID":"de85bd7a-e5fd-4893-9f0b-d98d2bc4c225","Type":"ContainerStarted","Data":"1e104e287154fbf9e732065bc645cccc46ff824223f61d88b386948af71c3042"}
Apr 16 13:14:21.380786 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.380739 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vgz9p" podStartSLOduration=128.958526769 podStartE2EDuration="2m11.380723686s" podCreationTimestamp="2026-04-16 13:12:10 +0000 UTC" firstStartedPulling="2026-04-16 13:14:18.694913786 +0000 UTC m=+161.384845583" lastFinishedPulling="2026-04-16 13:14:21.117110689 +0000 UTC m=+163.807042500" observedRunningTime="2026-04-16 13:14:21.379826657 +0000 UTC m=+164.069758472" watchObservedRunningTime="2026-04-16 13:14:21.380723686 +0000 UTC m=+164.070655500"
Apr 16 13:14:21.731707 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.731672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8be2c403-d308-4c1d-83f1-374a4c40788d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-9mprx\" (UID: \"8be2c403-d308-4c1d-83f1-374a4c40788d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"
Apr 16 13:14:21.734084 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.734062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8be2c403-d308-4c1d-83f1-374a4c40788d-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-9mprx\" (UID: \"8be2c403-d308-4c1d-83f1-374a4c40788d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"
Apr 16 13:14:21.987644 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:21.987538 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"
Apr 16 13:14:22.105810 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:22.105783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"]
Apr 16 13:14:22.108148 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:22.108122 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8be2c403_d308_4c1d_83f1_374a4c40788d.slice/crio-18fcf103011a991a426b7cbd22a8adfe294dfdbd8ace77f2ea36d9dc35fbfd05 WatchSource:0}: Error finding container 18fcf103011a991a426b7cbd22a8adfe294dfdbd8ace77f2ea36d9dc35fbfd05: Status 404 returned error can't find the container with id 18fcf103011a991a426b7cbd22a8adfe294dfdbd8ace77f2ea36d9dc35fbfd05
Apr 16 13:14:22.369642 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:22.369547 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx" event={"ID":"8be2c403-d308-4c1d-83f1-374a4c40788d","Type":"ContainerStarted","Data":"18fcf103011a991a426b7cbd22a8adfe294dfdbd8ace77f2ea36d9dc35fbfd05"}
Apr 16 13:14:23.965256 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:23.965229 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-57c969648d-q8lrq"
Apr 16 13:14:24.378868 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:24.378784 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx" event={"ID":"8be2c403-d308-4c1d-83f1-374a4c40788d","Type":"ContainerStarted","Data":"638a76c30b158b445a8c1fb5f10cb788924940472704de02a6bd6d616b9c4e62"}
Apr 16 13:14:24.379017 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:24.378981 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"
Apr 16 13:14:24.384657 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:24.384631 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx"
Apr 16 13:14:24.393668 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:24.393627 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-9mprx" podStartSLOduration=2.192222602 podStartE2EDuration="3.39361467s" podCreationTimestamp="2026-04-16 13:14:21 +0000 UTC" firstStartedPulling="2026-04-16 13:14:22.110160343 +0000 UTC m=+164.800092136" lastFinishedPulling="2026-04-16 13:14:23.311552397 +0000 UTC m=+166.001484204" observedRunningTime="2026-04-16 13:14:24.393023768 +0000 UTC m=+167.082955602" watchObservedRunningTime="2026-04-16 13:14:24.39361467 +0000 UTC m=+167.083546484"
Apr 16 13:14:27.880034 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:27.880000 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m"
Apr 16 13:14:30.367605 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.367568 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5ghzg"
Apr 16 13:14:30.477883 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.477851 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-8dckj"]
Apr 16 13:14:30.481115 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.481097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj"
Apr 16 13:14:30.484213 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.484182 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 13:14:30.484343 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.484210 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 13:14:30.484343 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.484182 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-pbdtp\""
Apr 16 13:14:30.484343 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.484291 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 13:14:30.484343 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.484297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 13:14:30.484545 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.484522 2574 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 13:14:30.491728 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.491706 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-8dckj"] Apr 16 13:14:30.495942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.495920 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-rqxwg"] Apr 16 13:14:30.501336 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.501312 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.504267 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.504247 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 13:14:30.504369 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.504292 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 13:14:30.504369 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.504328 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 13:14:30.504369 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.504292 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2q4p2\"" Apr 16 13:14:30.594658 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-metrics-client-ca\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " 
pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.594658 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594661 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r6np\" (UniqueName: \"kubernetes.io/projected/5feeb15e-917e-4399-bb74-32d60a57e6a9-kube-api-access-6r6np\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.594876 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594681 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-root\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.594876 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594735 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-accelerators-collector-config\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.594876 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-tls\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.594876 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-sys\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.594876 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5feeb15e-917e-4399-bb74-32d60a57e6a9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.594876 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594858 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-wtmp\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.594876 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kncgn\" (UniqueName: \"kubernetes.io/projected/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-kube-api-access-kncgn\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.595112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-textfile\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " 
pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.595112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5feeb15e-917e-4399-bb74-32d60a57e6a9-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.595112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594937 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.595112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.594970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5feeb15e-917e-4399-bb74-32d60a57e6a9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.695907 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.695861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5feeb15e-917e-4399-bb74-32d60a57e6a9-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.695907 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.695912 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.695956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5feeb15e-917e-4399-bb74-32d60a57e6a9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.696162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.695994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-metrics-client-ca\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6r6np\" (UniqueName: \"kubernetes.io/projected/5feeb15e-917e-4399-bb74-32d60a57e6a9-kube-api-access-6r6np\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.696162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-root\") pod \"node-exporter-rqxwg\" (UID: 
\"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696162 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-accelerators-collector-config\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696401 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-root\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696401 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696337 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-tls\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696523 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696418 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-sys\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696523 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-sys\") pod \"node-exporter-rqxwg\" (UID: 
\"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696523 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5feeb15e-917e-4399-bb74-32d60a57e6a9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:30.696421 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-wtmp\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696593 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kncgn\" (UniqueName: \"kubernetes.io/projected/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-kube-api-access-kncgn\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:30.696614 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-tls podName:abf89fc6-7379-4a8b-b65c-7d612bfa9e16 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:14:31.19659198 +0000 UTC m=+173.886523793 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-tls") pod "node-exporter-rqxwg" (UID: "abf89fc6-7379-4a8b-b65c-7d612bfa9e16") : secret "node-exporter-tls" not found Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-textfile\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5feeb15e-917e-4399-bb74-32d60a57e6a9-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696735 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-wtmp\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696753 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-metrics-client-ca\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " 
pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.696943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.696916 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-textfile\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.697343 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.697198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-accelerators-collector-config\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.698426 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.698402 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.698630 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.698611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5feeb15e-917e-4399-bb74-32d60a57e6a9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.698887 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.698868 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5feeb15e-917e-4399-bb74-32d60a57e6a9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.707520 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.707497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r6np\" (UniqueName: \"kubernetes.io/projected/5feeb15e-917e-4399-bb74-32d60a57e6a9-kube-api-access-6r6np\") pod \"openshift-state-metrics-5669946b84-8dckj\" (UID: \"5feeb15e-917e-4399-bb74-32d60a57e6a9\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.708983 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.708966 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kncgn\" (UniqueName: \"kubernetes.io/projected/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-kube-api-access-kncgn\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:30.789922 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.789883 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" Apr 16 13:14:30.909613 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:30.909591 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-8dckj"] Apr 16 13:14:30.911800 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:30.911772 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5feeb15e_917e_4399_bb74_32d60a57e6a9.slice/crio-e6aadd9e73126b2af6799318519737cccef787a0d8019c7805bd2dcc08891303 WatchSource:0}: Error finding container e6aadd9e73126b2af6799318519737cccef787a0d8019c7805bd2dcc08891303: Status 404 returned error can't find the container with id e6aadd9e73126b2af6799318519737cccef787a0d8019c7805bd2dcc08891303 Apr 16 13:14:31.203615 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.203512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-tls\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:31.205797 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.205767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/abf89fc6-7379-4a8b-b65c-7d612bfa9e16-node-exporter-tls\") pod \"node-exporter-rqxwg\" (UID: \"abf89fc6-7379-4a8b-b65c-7d612bfa9e16\") " pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:31.398709 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.398677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" 
event={"ID":"5feeb15e-917e-4399-bb74-32d60a57e6a9","Type":"ContainerStarted","Data":"64c3f6d943aedf8580aa713a72f1c6b0f178750a2120a0a6d2cf9972319a3caf"} Apr 16 13:14:31.398709 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.398712 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" event={"ID":"5feeb15e-917e-4399-bb74-32d60a57e6a9","Type":"ContainerStarted","Data":"f0d5cc3dcf78d2b15c5146090bae3e235193cb31e66d6a66a5782afe7ada4b37"} Apr 16 13:14:31.399106 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.398724 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" event={"ID":"5feeb15e-917e-4399-bb74-32d60a57e6a9","Type":"ContainerStarted","Data":"e6aadd9e73126b2af6799318519737cccef787a0d8019c7805bd2dcc08891303"} Apr 16 13:14:31.409958 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.409937 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-rqxwg" Apr 16 13:14:31.418323 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:31.418289 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabf89fc6_7379_4a8b_b65c_7d612bfa9e16.slice/crio-16dfdb0af22e7cdd6971d77d9a45173cc1b6a94122a45c971bf3157cc601c39f WatchSource:0}: Error finding container 16dfdb0af22e7cdd6971d77d9a45173cc1b6a94122a45c971bf3157cc601c39f: Status 404 returned error can't find the container with id 16dfdb0af22e7cdd6971d77d9a45173cc1b6a94122a45c971bf3157cc601c39f Apr 16 13:14:31.565968 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.565884 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 13:14:31.570480 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.570443 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:14:31.573666 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.573644 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 13:14:31.573881 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.573651 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7dhhl\"" Apr 16 13:14:31.573881 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.573767 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 13:14:31.573881 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.573772 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 13:14:31.573881 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.573843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 13:14:31.574098 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.574052 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 13:14:31.574381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.574353 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 13:14:31.574381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.574373 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 13:14:31.574537 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.574356 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 13:14:31.574537 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.574516 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 13:14:31.590465 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.590430 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 13:14:31.606184 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-volume\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606276 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-web-config\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606276 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606347 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606347 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606409 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606409 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606380 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606409 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606519 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606519 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmvz9\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-kube-api-access-hmvz9\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606519 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606494 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606610 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.606610 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.606549 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-out\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707153 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707120 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707331 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707331 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707209 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707331 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707331 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707331 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707373 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmvz9\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-kube-api-access-hmvz9\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707449 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-out\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-volume\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.707563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-web-config\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.707947 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:31.707907 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle podName:20f015bb-b55d-4cf1-b343-e98727d59a5b nodeName:}" failed. No retries permitted until 2026-04-16 13:14:32.207885351 +0000 UTC m=+174.897817168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b") : configmap references non-existent config key: ca-bundle.crt
Apr 16 13:14:31.708492 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.708441 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.709051 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.708724 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.709051 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:31.708809 2574 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found
Apr 16 13:14:31.709051 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:31.708905 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls podName:20f015bb-b55d-4cf1-b343-e98727d59a5b nodeName:}" failed. No retries permitted until 2026-04-16 13:14:32.208886579 +0000 UTC m=+174.898818372 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b") : secret "alertmanager-main-tls" not found
Apr 16 13:14:31.711924 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.711903 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.712120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.711926 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-volume\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.712120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.712094 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.712342 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.712160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.712342 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.712335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-web-config\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.712473 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.712416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-tls-assets\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.712548 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.712525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-out\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.713592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.713574 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.722492 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.722444 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmvz9\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-kube-api-access-hmvz9\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:31.879586 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:31.879511 2574 scope.go:117] "RemoveContainer" containerID="0b64a99a12f67bad00625c7c9dbb736961e62c4b0abc04607ebd98ef1f1d11ae"
Apr 16 13:14:32.214196 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.214128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:32.214385 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.214217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:32.216425 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.216391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:32.217077 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.217052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:32.407248 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.405136 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log"
Apr 16 13:14:32.407248 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.405229 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" event={"ID":"53d7350f-ab9a-4dba-a9fe-1a1980008f10","Type":"ContainerStarted","Data":"a8233bdc780a84f14e9e0199c1589a2fef1e38a4e92293671608d4a15c5550aa"}
Apr 16 13:14:32.407248 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.406174 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7"
Apr 16 13:14:32.409634 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.409601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rqxwg" event={"ID":"abf89fc6-7379-4a8b-b65c-7d612bfa9e16","Type":"ContainerStarted","Data":"16dfdb0af22e7cdd6971d77d9a45173cc1b6a94122a45c971bf3157cc601c39f"}
Apr 16 13:14:32.412242 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.412220 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7"
Apr 16 13:14:32.426844 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.426808 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-ppgg7" podStartSLOduration=43.18255341 podStartE2EDuration="46.426793374s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="2026-04-16 13:13:47.105011891 +0000 UTC m=+129.794943684" lastFinishedPulling="2026-04-16 13:13:50.34925184 +0000 UTC m=+133.039183648" observedRunningTime="2026-04-16 13:14:32.425936738 +0000 UTC m=+175.115868556" watchObservedRunningTime="2026-04-16 13:14:32.426793374 +0000 UTC m=+175.116725190"
Apr 16 13:14:32.457574 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.457500 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-t7qss"]
Apr 16 13:14:32.462101 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.462080 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-t7qss"
Apr 16 13:14:32.465855 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.465784 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 13:14:32.466323 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.466162 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-dpxwm\""
Apr 16 13:14:32.466323 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.466186 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 13:14:32.478259 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.478222 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-t7qss"]
Apr 16 13:14:32.479906 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.479855 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:14:32.520251 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.520128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj2km\" (UniqueName: \"kubernetes.io/projected/ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e-kube-api-access-vj2km\") pod \"downloads-586b57c7b4-t7qss\" (UID: \"ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e\") " pod="openshift-console/downloads-586b57c7b4-t7qss"
Apr 16 13:14:32.566782 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.566663 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7f878f577b-4q66m"]
Apr 16 13:14:32.575658 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.575621 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.578846 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.578818 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fpch9nnmhi548\""
Apr 16 13:14:32.578988 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.578889 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 13:14:32.579438 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.579101 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 13:14:32.579438 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.578818 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 13:14:32.579438 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.579166 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 13:14:32.579438 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.579398 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 13:14:32.579793 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.579679 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-5jkvn\""
Apr 16 13:14:32.582685 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.582593 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f878f577b-4q66m"]
Apr 16 13:14:32.620948 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.620921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfb03555-ea15-4b8c-a044-56687ec8276a-metrics-client-ca\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.621086 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.620953 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqw5\" (UniqueName: \"kubernetes.io/projected/cfb03555-ea15-4b8c-a044-56687ec8276a-kube-api-access-nhqw5\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.621086 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.620985 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.621086 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.621021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-tls\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.621217 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.621101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-grpc-tls\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.621217 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.621152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj2km\" (UniqueName: \"kubernetes.io/projected/ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e-kube-api-access-vj2km\") pod \"downloads-586b57c7b4-t7qss\" (UID: \"ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e\") " pod="openshift-console/downloads-586b57c7b4-t7qss"
Apr 16 13:14:32.621314 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.621226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.621314 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.621285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.621401 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.621321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.627209 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.627186 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 13:14:32.633083 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.633058 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj2km\" (UniqueName: \"kubernetes.io/projected/ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e-kube-api-access-vj2km\") pod \"downloads-586b57c7b4-t7qss\" (UID: \"ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e\") " pod="openshift-console/downloads-586b57c7b4-t7qss"
Apr 16 13:14:32.662441 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:32.662405 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f015bb_b55d_4cf1_b343_e98727d59a5b.slice/crio-a4ecc909b5f84cd3244c4a7c424fcf1c5395335f3327a52e8051a7dc0a627c52 WatchSource:0}: Error finding container a4ecc909b5f84cd3244c4a7c424fcf1c5395335f3327a52e8051a7dc0a627c52: Status 404 returned error can't find the container with id a4ecc909b5f84cd3244c4a7c424fcf1c5395335f3327a52e8051a7dc0a627c52
Apr 16 13:14:32.721993 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.721908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.721993 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.721953 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-tls\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.721993 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.721989 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-grpc-tls\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.722232 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.722036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.722288 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.722218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.722288 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.722260 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.722387 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.722328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfb03555-ea15-4b8c-a044-56687ec8276a-metrics-client-ca\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.722439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.722408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqw5\" (UniqueName: \"kubernetes.io/projected/cfb03555-ea15-4b8c-a044-56687ec8276a-kube-api-access-nhqw5\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.723995 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.723964 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfb03555-ea15-4b8c-a044-56687ec8276a-metrics-client-ca\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.724767 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.724747 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-tls\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.724970 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.724949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-grpc-tls\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.725011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.724983 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.725121 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.725101 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.725159 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.725102 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.725159 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.725114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cfb03555-ea15-4b8c-a044-56687ec8276a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.729628 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.729607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqw5\" (UniqueName: \"kubernetes.io/projected/cfb03555-ea15-4b8c-a044-56687ec8276a-kube-api-access-nhqw5\") pod \"thanos-querier-7f878f577b-4q66m\" (UID: \"cfb03555-ea15-4b8c-a044-56687ec8276a\") " pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.779253 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.779210 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-t7qss"
Apr 16 13:14:32.891432 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.891393 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:32.902050 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:32.902026 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-t7qss"]
Apr 16 13:14:32.904266 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:32.904234 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podace8c1d3_a5ca_44e1_aeb7_2c82fe2b253e.slice/crio-c1003b9acd115dd40fd710fa9db7bb1458083f5797b1c1ba90dac431b707d41c WatchSource:0}: Error finding container c1003b9acd115dd40fd710fa9db7bb1458083f5797b1c1ba90dac431b707d41c: Status 404 returned error can't find the container with id c1003b9acd115dd40fd710fa9db7bb1458083f5797b1c1ba90dac431b707d41c
Apr 16 13:14:33.017446 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:33.017361 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7f878f577b-4q66m"]
Apr 16 13:14:33.020485 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:33.020437 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb03555_ea15_4b8c_a044_56687ec8276a.slice/crio-58d46c9973cbc50d052b696520c11a6b8064f6cda1556eb1c2322a8364f5669b WatchSource:0}: Error finding container 58d46c9973cbc50d052b696520c11a6b8064f6cda1556eb1c2322a8364f5669b: Status 404 returned error can't find the container with id 58d46c9973cbc50d052b696520c11a6b8064f6cda1556eb1c2322a8364f5669b
Apr 16 13:14:33.415548 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:33.415362 2574 generic.go:358] "Generic (PLEG): container finished" podID="abf89fc6-7379-4a8b-b65c-7d612bfa9e16" containerID="5c62b4e927861160eefdedd42207c00775f2a2ac750e3f6590bd0cc972ec4b0d" exitCode=0
Apr 16 13:14:33.415548 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:33.415505 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="openshift-monitoring/node-exporter-rqxwg" event={"ID":"abf89fc6-7379-4a8b-b65c-7d612bfa9e16","Type":"ContainerDied","Data":"5c62b4e927861160eefdedd42207c00775f2a2ac750e3f6590bd0cc972ec4b0d"} Apr 16 13:14:33.418643 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:33.418612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" event={"ID":"5feeb15e-917e-4399-bb74-32d60a57e6a9","Type":"ContainerStarted","Data":"43a9957ea70ef76ddbd3a47fb19aa9c6a314f1b74044a6f8fa277cf8b86ef2ac"} Apr 16 13:14:33.420170 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:33.420126 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" event={"ID":"cfb03555-ea15-4b8c-a044-56687ec8276a","Type":"ContainerStarted","Data":"58d46c9973cbc50d052b696520c11a6b8064f6cda1556eb1c2322a8364f5669b"} Apr 16 13:14:33.421930 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:33.421904 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-t7qss" event={"ID":"ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e","Type":"ContainerStarted","Data":"c1003b9acd115dd40fd710fa9db7bb1458083f5797b1c1ba90dac431b707d41c"} Apr 16 13:14:33.423701 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:33.423627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerStarted","Data":"a4ecc909b5f84cd3244c4a7c424fcf1c5395335f3327a52e8051a7dc0a627c52"} Apr 16 13:14:33.458276 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:33.458071 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-8dckj" podStartSLOduration=2.059596532 podStartE2EDuration="3.458051792s" podCreationTimestamp="2026-04-16 13:14:30 +0000 UTC" firstStartedPulling="2026-04-16 13:14:31.024186787 +0000 UTC m=+173.714118580" 
lastFinishedPulling="2026-04-16 13:14:32.422642038 +0000 UTC m=+175.112573840" observedRunningTime="2026-04-16 13:14:33.456673301 +0000 UTC m=+176.146605129" watchObservedRunningTime="2026-04-16 13:14:33.458051792 +0000 UTC m=+176.147983609" Apr 16 13:14:34.428786 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:34.428747 2574 generic.go:358] "Generic (PLEG): container finished" podID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerID="39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180" exitCode=0 Apr 16 13:14:34.429226 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:34.428865 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerDied","Data":"39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180"} Apr 16 13:14:34.432212 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:34.432184 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rqxwg" event={"ID":"abf89fc6-7379-4a8b-b65c-7d612bfa9e16","Type":"ContainerStarted","Data":"a6f529a56444abd57d97fc840ab991add2c32082e9e3c2249ac3f87675fe03a5"} Apr 16 13:14:34.432326 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:34.432223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-rqxwg" event={"ID":"abf89fc6-7379-4a8b-b65c-7d612bfa9e16","Type":"ContainerStarted","Data":"b52bd470f25466a365761d278bdb06acafe4701da14ed3cf4844274186c1510a"} Apr 16 13:14:34.476379 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:34.476301 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-rqxwg" podStartSLOduration=3.475716798 podStartE2EDuration="4.476280811s" podCreationTimestamp="2026-04-16 13:14:30 +0000 UTC" firstStartedPulling="2026-04-16 13:14:31.420178679 +0000 UTC m=+174.110110472" lastFinishedPulling="2026-04-16 13:14:32.420742679 +0000 UTC 
m=+175.110674485" observedRunningTime="2026-04-16 13:14:34.474486336 +0000 UTC m=+177.164418149" watchObservedRunningTime="2026-04-16 13:14:34.476280811 +0000 UTC m=+177.166212627" Apr 16 13:14:35.262047 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.261406 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w"] Apr 16 13:14:35.265684 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.265288 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" Apr 16 13:14:35.268004 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.267979 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 13:14:35.268125 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.267992 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-szp2v\"" Apr 16 13:14:35.272303 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.272255 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w"] Apr 16 13:14:35.348611 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.348490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4db29eaf-1248-4f16-9c89-f07661408149-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-lnb6w\" (UID: \"4db29eaf-1248-4f16-9c89-f07661408149\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" Apr 16 13:14:35.450012 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.449887 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4db29eaf-1248-4f16-9c89-f07661408149-monitoring-plugin-cert\") pod 
\"monitoring-plugin-5876b4bbc7-lnb6w\" (UID: \"4db29eaf-1248-4f16-9c89-f07661408149\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" Apr 16 13:14:35.455481 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.455406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4db29eaf-1248-4f16-9c89-f07661408149-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-lnb6w\" (UID: \"4db29eaf-1248-4f16-9c89-f07661408149\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" Apr 16 13:14:35.578857 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.578761 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" Apr 16 13:14:35.724294 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:35.724267 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w"] Apr 16 13:14:36.082337 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:36.082309 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4db29eaf_1248_4f16_9c89_f07661408149.slice/crio-f87ba76efc4dabb50eb0d7f95ea0f347f39ae58cc547c405932808b223a34407 WatchSource:0}: Error finding container f87ba76efc4dabb50eb0d7f95ea0f347f39ae58cc547c405932808b223a34407: Status 404 returned error can't find the container with id f87ba76efc4dabb50eb0d7f95ea0f347f39ae58cc547c405932808b223a34407 Apr 16 13:14:36.441170 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:36.441132 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" event={"ID":"4db29eaf-1248-4f16-9c89-f07661408149","Type":"ContainerStarted","Data":"f87ba76efc4dabb50eb0d7f95ea0f347f39ae58cc547c405932808b223a34407"} Apr 16 13:14:36.443512 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:36.443482 
2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" event={"ID":"cfb03555-ea15-4b8c-a044-56687ec8276a","Type":"ContainerStarted","Data":"3e3f699f39f05cc7e3517cdde848711e588ba905bcf1a51b69eeda3190b3df9c"} Apr 16 13:14:36.443644 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:36.443524 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" event={"ID":"cfb03555-ea15-4b8c-a044-56687ec8276a","Type":"ContainerStarted","Data":"a5830f59015d7ceec89f28922ce5be5615b4ddc2780b92ef07850886490343a2"} Apr 16 13:14:36.443644 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:36.443539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" event={"ID":"cfb03555-ea15-4b8c-a044-56687ec8276a","Type":"ContainerStarted","Data":"341c3951023cea7e8f1c7d503f33d69433451f35c0a5b0be8bb8a4a9c49c58f6"} Apr 16 13:14:36.445872 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:36.445848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerStarted","Data":"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769"} Apr 16 13:14:36.445990 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:36.445878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerStarted","Data":"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596"} Apr 16 13:14:36.445990 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:36.445897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerStarted","Data":"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd"} Apr 16 13:14:37.452677 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:37.452641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerStarted","Data":"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24"} Apr 16 13:14:37.452677 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:37.452679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerStarted","Data":"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b"} Apr 16 13:14:38.458705 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.458664 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" event={"ID":"cfb03555-ea15-4b8c-a044-56687ec8276a","Type":"ContainerStarted","Data":"7f7ce0b18ad28f37dde32d686c9dac095798c35d46c2cab6e13c21e6ff14d249"} Apr 16 13:14:38.458705 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.458705 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" event={"ID":"cfb03555-ea15-4b8c-a044-56687ec8276a","Type":"ContainerStarted","Data":"4eb3bd9fbd93644bc26c0d9a51ef6e8171b4f7a3836d2542d46740dce5a832f3"} Apr 16 13:14:38.458705 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.458715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" event={"ID":"cfb03555-ea15-4b8c-a044-56687ec8276a","Type":"ContainerStarted","Data":"29c42f9ab320b2597d1cf0b80b8068a4265d3cef4dd9affd02eeefd5dae4c23c"} Apr 16 13:14:38.459273 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.458863 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" Apr 16 13:14:38.462110 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.462080 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerStarted","Data":"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13"} Apr 16 13:14:38.463810 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.463773 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" event={"ID":"4db29eaf-1248-4f16-9c89-f07661408149","Type":"ContainerStarted","Data":"9d678db3e7bc58e8262c0f022d0a033639be7ffb78f85caa0a2a17e7d305664c"} Apr 16 13:14:38.464047 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.464024 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" Apr 16 13:14:38.469570 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.469546 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" Apr 16 13:14:38.482761 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.482692 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m" podStartSLOduration=1.977975806 podStartE2EDuration="6.482672976s" podCreationTimestamp="2026-04-16 13:14:32 +0000 UTC" firstStartedPulling="2026-04-16 13:14:33.02235105 +0000 UTC m=+175.712282850" lastFinishedPulling="2026-04-16 13:14:37.527048183 +0000 UTC m=+180.216980020" observedRunningTime="2026-04-16 13:14:38.480049365 +0000 UTC m=+181.169981183" watchObservedRunningTime="2026-04-16 13:14:38.482672976 +0000 UTC m=+181.172604792" Apr 16 13:14:38.494849 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.494799 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-lnb6w" podStartSLOduration=2.006058492 podStartE2EDuration="3.494782999s" 
podCreationTimestamp="2026-04-16 13:14:35 +0000 UTC" firstStartedPulling="2026-04-16 13:14:36.084676175 +0000 UTC m=+178.774607973" lastFinishedPulling="2026-04-16 13:14:37.573400684 +0000 UTC m=+180.263332480" observedRunningTime="2026-04-16 13:14:38.494102861 +0000 UTC m=+181.184034677" watchObservedRunningTime="2026-04-16 13:14:38.494782999 +0000 UTC m=+181.184714814" Apr 16 13:14:38.525168 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.525112 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.6616935010000002 podStartE2EDuration="7.525096342s" podCreationTimestamp="2026-04-16 13:14:31 +0000 UTC" firstStartedPulling="2026-04-16 13:14:32.664321555 +0000 UTC m=+175.354253349" lastFinishedPulling="2026-04-16 13:14:37.527724385 +0000 UTC m=+180.217656190" observedRunningTime="2026-04-16 13:14:38.523183668 +0000 UTC m=+181.213115486" watchObservedRunningTime="2026-04-16 13:14:38.525096342 +0000 UTC m=+181.215028218" Apr 16 13:14:38.980470 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:38.980381 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" podUID="6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" containerName="registry" containerID="cri-o://c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234" gracePeriod=30 Apr 16 13:14:39.250301 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.250276 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:14:39.387592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.387554 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-bound-sa-token\") pod \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " Apr 16 13:14:39.387592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.387604 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-certificates\") pod \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " Apr 16 13:14:39.387811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.387627 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-installation-pull-secrets\") pod \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " Apr 16 13:14:39.387811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.387673 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-ca-trust-extracted\") pod \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " Apr 16 13:14:39.387811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.387709 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-trusted-ca\") pod \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " Apr 16 
13:14:39.387811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.387739 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgcfq\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-kube-api-access-lgcfq\") pod \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " Apr 16 13:14:39.387811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.387767 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-image-registry-private-configuration\") pod \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " Apr 16 13:14:39.387811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.387791 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") pod \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\" (UID: \"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6\") " Apr 16 13:14:39.388263 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.388091 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:14:39.388263 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.388156 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:14:39.390268 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.390236 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:14:39.390376 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.390285 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:14:39.390376 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.390305 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:14:39.390514 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.390429 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:14:39.390622 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.390601 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-kube-api-access-lgcfq" (OuterVolumeSpecName: "kube-api-access-lgcfq") pod "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6"). InnerVolumeSpecName "kube-api-access-lgcfq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:14:39.396701 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.396667 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" (UID: "6b14168a-8ca2-4c5a-a9ba-0238e229d4b6"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:14:39.468330 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.468295 2574 generic.go:358] "Generic (PLEG): container finished" podID="6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" containerID="c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234" exitCode=0 Apr 16 13:14:39.468832 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.468371 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" Apr 16 13:14:39.468832 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.468401 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" event={"ID":"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6","Type":"ContainerDied","Data":"c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234"} Apr 16 13:14:39.468832 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.468437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-57c969648d-q8lrq" event={"ID":"6b14168a-8ca2-4c5a-a9ba-0238e229d4b6","Type":"ContainerDied","Data":"a5c3a372457e2d58d14291d1b3a48a44d6b9b7112f5c05a1974e53632560b6ad"} Apr 16 13:14:39.468832 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.468475 2574 scope.go:117] "RemoveContainer" containerID="c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234" Apr 16 13:14:39.479167 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.479145 2574 scope.go:117] "RemoveContainer" containerID="c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234" Apr 16 13:14:39.479549 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:14:39.479519 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234\": container with ID starting with c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234 not found: ID does not exist" containerID="c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234" Apr 16 13:14:39.479637 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.479564 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234"} err="failed to get container status 
\"c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234\": rpc error: code = NotFound desc = could not find container \"c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234\": container with ID starting with c633e1fa377516aa275f9353636645652234463aa05e2c5a50fae6c9b73e7234 not found: ID does not exist" Apr 16 13:14:39.488988 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.488909 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-installation-pull-secrets\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:14:39.488988 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.488941 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-ca-trust-extracted\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:14:39.488988 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.488956 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-trusted-ca\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:14:39.488988 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.488970 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lgcfq\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-kube-api-access-lgcfq\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:14:39.488988 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.488982 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-image-registry-private-configuration\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:14:39.488988 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:14:39.488991 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-tls\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:14:39.489352 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.488999 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-bound-sa-token\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:14:39.489352 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.489008 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6-registry-certificates\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:14:39.501142 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.501099 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57c969648d-q8lrq"]
Apr 16 13:14:39.506241 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.506205 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57c969648d-q8lrq"]
Apr 16 13:14:39.883414 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:39.883323 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" path="/var/lib/kubelet/pods/6b14168a-8ca2-4c5a-a9ba-0238e229d4b6/volumes"
Apr 16 13:14:44.016207 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.016175 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5cfb58c84-fd7rt"]
Apr 16 13:14:44.016801 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.016635 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" containerName="registry"
Apr 16 13:14:44.016801 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.016652 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" containerName="registry"
Apr 16 13:14:44.016801 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.016741 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b14168a-8ca2-4c5a-a9ba-0238e229d4b6" containerName="registry"
Apr 16 13:14:44.022142 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.022119 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.025007 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.024949 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 13:14:44.026215 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.026162 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 13:14:44.026500 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.026479 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-tcftn\""
Apr 16 13:14:44.026500 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.026494 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 13:14:44.026677 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.026511 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 13:14:44.026677 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.026541 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cfb58c84-fd7rt"]
Apr 16 13:14:44.026677 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.026484 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 13:14:44.129890 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.129848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-oauth-config\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.130121 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.129903 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-console-config\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.130121 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.129950 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-oauth-serving-cert\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.130121 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.130017 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-service-ca\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.130121 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.130037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w4ph\" (UniqueName: \"kubernetes.io/projected/9241f232-bf65-44f0-8074-704d349dc8db-kube-api-access-8w4ph\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.130121 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.130055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-serving-cert\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.230492 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.230431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-console-config\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.230492 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.230491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-oauth-serving-cert\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.230728 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.230535 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-service-ca\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.230728 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.230551 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w4ph\" (UniqueName: \"kubernetes.io/projected/9241f232-bf65-44f0-8074-704d349dc8db-kube-api-access-8w4ph\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.230728 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.230576 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-serving-cert\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.230942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.230912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-oauth-config\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.231285 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.231231 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-console-config\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.231394 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.231290 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-service-ca\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.231481 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.231443 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-oauth-serving-cert\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.233450 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.233428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-serving-cert\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.233555 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.233499 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-oauth-config\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.239323 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.239299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w4ph\" (UniqueName: \"kubernetes.io/projected/9241f232-bf65-44f0-8074-704d349dc8db-kube-api-access-8w4ph\") pod \"console-5cfb58c84-fd7rt\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") " pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.335022 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.334927 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:14:44.475789 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.475601 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5cfb58c84-fd7rt"]
Apr 16 13:14:44.476497 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.476479 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7f878f577b-4q66m"
Apr 16 13:14:44.480991 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:44.480961 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9241f232_bf65_44f0_8074_704d349dc8db.slice/crio-06763fadd18e4cf78c027b398b54511e7ae988190ba93f1fd54fead6accdc9c7 WatchSource:0}: Error finding container 06763fadd18e4cf78c027b398b54511e7ae988190ba93f1fd54fead6accdc9c7: Status 404 returned error can't find the container with id 06763fadd18e4cf78c027b398b54511e7ae988190ba93f1fd54fead6accdc9c7
Apr 16 13:14:44.487018 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:44.486987 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cfb58c84-fd7rt" event={"ID":"9241f232-bf65-44f0-8074-704d349dc8db","Type":"ContainerStarted","Data":"06763fadd18e4cf78c027b398b54511e7ae988190ba93f1fd54fead6accdc9c7"}
Apr 16 13:14:51.508375 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:51.508340 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-t7qss" event={"ID":"ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e","Type":"ContainerStarted","Data":"f6b1bda9ec47060d04d689b5982427b83cdfb42fd9619163cf026daa571d0d2d"}
Apr 16 13:14:51.508797 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:51.508485 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-t7qss"
Apr 16 13:14:51.510092 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:51.510065 2574 patch_prober.go:28] interesting pod/downloads-586b57c7b4-t7qss container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.18:8080/\": dial tcp 10.134.0.18:8080: connect: connection refused" start-of-body=
Apr 16 13:14:51.510207 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:51.510113 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-586b57c7b4-t7qss" podUID="ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.18:8080/\": dial tcp 10.134.0.18:8080: connect: connection refused"
Apr 16 13:14:51.526524 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:51.526446 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-t7qss" podStartSLOduration=1.046899954 podStartE2EDuration="19.526427195s" podCreationTimestamp="2026-04-16 13:14:32 +0000 UTC" firstStartedPulling="2026-04-16 13:14:32.90605159 +0000 UTC m=+175.595983384" lastFinishedPulling="2026-04-16 13:14:51.385578832 +0000 UTC m=+194.075510625" observedRunningTime="2026-04-16 13:14:51.524533393 +0000 UTC m=+194.214465212" watchObservedRunningTime="2026-04-16 13:14:51.526427195 +0000 UTC m=+194.216359011"
Apr 16 13:14:52.525803 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:52.525693 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-t7qss"
Apr 16 13:14:53.753962 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.753798 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f8b46df5b-nj7bv"]
Apr 16 13:14:53.791745 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.791703 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f8b46df5b-nj7bv"]
Apr 16 13:14:53.791925 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.791792 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:53.802261 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.802235 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 13:14:53.927255 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.927214 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-oauth-serving-cert\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:53.927496 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.927288 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-config\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:53.927496 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.927319 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-oauth-config\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:53.927720 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.927688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-service-ca\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:53.927940 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.927740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-trusted-ca-bundle\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:53.927940 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.927769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49dl\" (UniqueName: \"kubernetes.io/projected/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-kube-api-access-k49dl\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:53.927940 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:53.927801 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-serving-cert\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.029399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.029283 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-oauth-serving-cert\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.029399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.029347 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-config\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.029399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.029394 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-oauth-config\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.029721 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.029485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-service-ca\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.029721 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.029521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-trusted-ca-bundle\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.029721 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.029545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k49dl\" (UniqueName: \"kubernetes.io/projected/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-kube-api-access-k49dl\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.029721 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.029576 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-serving-cert\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.032298 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.032237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-config\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.034050 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.032816 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-service-ca\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.034050 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.033661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-trusted-ca-bundle\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.034349 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.034262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-oauth-serving-cert\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.035402 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.035357 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-oauth-config\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.035798 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.035772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-serving-cert\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.039639 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.039592 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49dl\" (UniqueName: \"kubernetes.io/projected/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-kube-api-access-k49dl\") pod \"console-5f8b46df5b-nj7bv\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") " pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.106755 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.106715 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:14:54.377750 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.377668 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f8b46df5b-nj7bv"]
Apr 16 13:14:54.380935 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:14:54.380900 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cc78166_8f87_496d_a3a8_8a6aaf3c0173.slice/crio-7969f8364a6a97b650bb9b2fc846fb9d202221e5f1d20ec54b853358c1f5f32a WatchSource:0}: Error finding container 7969f8364a6a97b650bb9b2fc846fb9d202221e5f1d20ec54b853358c1f5f32a: Status 404 returned error can't find the container with id 7969f8364a6a97b650bb9b2fc846fb9d202221e5f1d20ec54b853358c1f5f32a
Apr 16 13:14:54.521633 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:54.521590 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8b46df5b-nj7bv" event={"ID":"2cc78166-8f87-496d-a3a8-8a6aaf3c0173","Type":"ContainerStarted","Data":"7969f8364a6a97b650bb9b2fc846fb9d202221e5f1d20ec54b853358c1f5f32a"}
Apr 16 13:14:55.526608 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:55.526559 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cfb58c84-fd7rt" event={"ID":"9241f232-bf65-44f0-8074-704d349dc8db","Type":"ContainerStarted","Data":"5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f"}
Apr 16 13:14:55.528163 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:55.528138 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8b46df5b-nj7bv" event={"ID":"2cc78166-8f87-496d-a3a8-8a6aaf3c0173","Type":"ContainerStarted","Data":"0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e"}
Apr 16 13:14:55.543298 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:55.543236 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5cfb58c84-fd7rt" podStartSLOduration=2.417892829 podStartE2EDuration="12.543216735s" podCreationTimestamp="2026-04-16 13:14:43 +0000 UTC" firstStartedPulling="2026-04-16 13:14:44.482976586 +0000 UTC m=+187.172908394" lastFinishedPulling="2026-04-16 13:14:54.60830049 +0000 UTC m=+197.298232300" observedRunningTime="2026-04-16 13:14:55.542728591 +0000 UTC m=+198.232660419" watchObservedRunningTime="2026-04-16 13:14:55.543216735 +0000 UTC m=+198.233148551"
Apr 16 13:14:55.559509 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:55.559436 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f8b46df5b-nj7bv" podStartSLOduration=2.027330236 podStartE2EDuration="2.559418993s" podCreationTimestamp="2026-04-16 13:14:53 +0000 UTC" firstStartedPulling="2026-04-16 13:14:54.385578288 +0000 UTC m=+197.075510093" lastFinishedPulling="2026-04-16 13:14:54.917667054 +0000 UTC m=+197.607598850" observedRunningTime="2026-04-16 13:14:55.558142931 +0000 UTC m=+198.248074770" watchObservedRunningTime="2026-04-16 13:14:55.559418993 +0000 UTC m=+198.249350808"
Apr 16 13:14:56.532717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:56.532664 2574 generic.go:358] "Generic (PLEG): container finished" podID="93de10c9-feba-4c72-8314-241a4ff3342c" containerID="c7c5f4b85252dd84a3bb28e98d177805d95179f1ab54661c9fa71d361b5f54ba" exitCode=0
Apr 16 13:14:56.533190 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:56.532734 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" event={"ID":"93de10c9-feba-4c72-8314-241a4ff3342c","Type":"ContainerDied","Data":"c7c5f4b85252dd84a3bb28e98d177805d95179f1ab54661c9fa71d361b5f54ba"}
Apr 16 13:14:56.533287 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:56.533270 2574 scope.go:117] "RemoveContainer" containerID="c7c5f4b85252dd84a3bb28e98d177805d95179f1ab54661c9fa71d361b5f54ba"
Apr 16 13:14:57.538735 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:14:57.538692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-qp97p" event={"ID":"93de10c9-feba-4c72-8314-241a4ff3342c","Type":"ContainerStarted","Data":"56f6b41dca46aa60e082e846efecae67e1c91354e667a97d3814541f9b837b1f"}
Apr 16 13:15:04.107444 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.107407 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:15:04.107444 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.107472 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:15:04.112358 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.112331 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:15:04.335805 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.335772 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:15:04.335805 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.335814 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:15:04.340670 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.340645 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:15:04.564199 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.564173 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:15:04.564726 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.564704 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:15:04.625618 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:04.625592 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cfb58c84-fd7rt"]
Apr 16 13:15:16.603257 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:16.603221 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ea21a67-8607-4bf1-9fdc-dcdefcaf7121" containerID="25e6834e934182c618df41db98859dc12f88cc13a5ad41eadbd4613c338b790b" exitCode=0
Apr 16 13:15:16.603681 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:16.603280 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" event={"ID":"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121","Type":"ContainerDied","Data":"25e6834e934182c618df41db98859dc12f88cc13a5ad41eadbd4613c338b790b"}
Apr 16 13:15:16.603681 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:16.603644 2574 scope.go:117] "RemoveContainer" containerID="25e6834e934182c618df41db98859dc12f88cc13a5ad41eadbd4613c338b790b"
Apr 16 13:15:17.607210 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:17.607172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-nth8m" event={"ID":"6ea21a67-8607-4bf1-9fdc-dcdefcaf7121","Type":"ContainerStarted","Data":"931c32f99a4c842d4679471ff38befa495cd1f10df5c14522da8781601e38b4c"}
Apr 16 13:15:31.584439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.584376 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5cfb58c84-fd7rt" podUID="9241f232-bf65-44f0-8074-704d349dc8db" containerName="console" containerID="cri-o://5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f" gracePeriod=15
Apr 16 13:15:31.901064 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.901039 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cfb58c84-fd7rt_9241f232-bf65-44f0-8074-704d349dc8db/console/0.log"
Apr 16 13:15:31.901186 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.901103 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cfb58c84-fd7rt"
Apr 16 13:15:31.958430 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.958400 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8w4ph\" (UniqueName: \"kubernetes.io/projected/9241f232-bf65-44f0-8074-704d349dc8db-kube-api-access-8w4ph\") pod \"9241f232-bf65-44f0-8074-704d349dc8db\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") "
Apr 16 13:15:31.958617 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.958436 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-oauth-serving-cert\") pod \"9241f232-bf65-44f0-8074-704d349dc8db\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") "
Apr 16 13:15:31.958617 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.958480 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-service-ca\") pod \"9241f232-bf65-44f0-8074-704d349dc8db\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") "
Apr 16 13:15:31.958617 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.958553 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-console-config\") pod \"9241f232-bf65-44f0-8074-704d349dc8db\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") "
Apr 16 13:15:31.958617 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.958594 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-serving-cert\") pod \"9241f232-bf65-44f0-8074-704d349dc8db\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") "
Apr 16 13:15:31.958825 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.958644 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-oauth-config\") pod \"9241f232-bf65-44f0-8074-704d349dc8db\" (UID: \"9241f232-bf65-44f0-8074-704d349dc8db\") "
Apr 16 13:15:31.959283 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.959248 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-service-ca" (OuterVolumeSpecName: "service-ca") pod "9241f232-bf65-44f0-8074-704d349dc8db" (UID: "9241f232-bf65-44f0-8074-704d349dc8db"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:15:31.959387 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.959320 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-console-config" (OuterVolumeSpecName: "console-config") pod "9241f232-bf65-44f0-8074-704d349dc8db" (UID: "9241f232-bf65-44f0-8074-704d349dc8db"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:15:31.966015 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.959278 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9241f232-bf65-44f0-8074-704d349dc8db" (UID: "9241f232-bf65-44f0-8074-704d349dc8db"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:15:31.966015 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.962325 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9241f232-bf65-44f0-8074-704d349dc8db" (UID: "9241f232-bf65-44f0-8074-704d349dc8db"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:31.966015 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.963629 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9241f232-bf65-44f0-8074-704d349dc8db" (UID: "9241f232-bf65-44f0-8074-704d349dc8db"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:31.967882 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:31.967861 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9241f232-bf65-44f0-8074-704d349dc8db-kube-api-access-8w4ph" (OuterVolumeSpecName: "kube-api-access-8w4ph") pod "9241f232-bf65-44f0-8074-704d349dc8db" (UID: "9241f232-bf65-44f0-8074-704d349dc8db"). InnerVolumeSpecName "kube-api-access-8w4ph".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:15:32.060105 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.060064 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-serving-cert\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:15:32.060105 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.060098 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9241f232-bf65-44f0-8074-704d349dc8db-console-oauth-config\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:15:32.060105 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.060112 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8w4ph\" (UniqueName: \"kubernetes.io/projected/9241f232-bf65-44f0-8074-704d349dc8db-kube-api-access-8w4ph\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:15:32.060340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.060125 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-oauth-serving-cert\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:15:32.060340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.060139 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-service-ca\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:15:32.060340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.060152 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9241f232-bf65-44f0-8074-704d349dc8db-console-config\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:15:32.650907 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:15:32.650873 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5cfb58c84-fd7rt_9241f232-bf65-44f0-8074-704d349dc8db/console/0.log" Apr 16 13:15:32.651374 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.650919 2574 generic.go:358] "Generic (PLEG): container finished" podID="9241f232-bf65-44f0-8074-704d349dc8db" containerID="5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f" exitCode=2 Apr 16 13:15:32.651374 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.650990 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cfb58c84-fd7rt" event={"ID":"9241f232-bf65-44f0-8074-704d349dc8db","Type":"ContainerDied","Data":"5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f"} Apr 16 13:15:32.651374 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.651003 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5cfb58c84-fd7rt" Apr 16 13:15:32.651374 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.651030 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5cfb58c84-fd7rt" event={"ID":"9241f232-bf65-44f0-8074-704d349dc8db","Type":"ContainerDied","Data":"06763fadd18e4cf78c027b398b54511e7ae988190ba93f1fd54fead6accdc9c7"} Apr 16 13:15:32.651374 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.651053 2574 scope.go:117] "RemoveContainer" containerID="5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f" Apr 16 13:15:32.659193 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.659175 2574 scope.go:117] "RemoveContainer" containerID="5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f" Apr 16 13:15:32.659547 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:15:32.659440 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f\": container with ID starting with 5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f not found: ID does not exist" containerID="5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f" Apr 16 13:15:32.659547 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.659481 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f"} err="failed to get container status \"5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f\": rpc error: code = NotFound desc = could not find container \"5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f\": container with ID starting with 5a9983cf4b063f161b207dfdbef9ec01b47d5976f486bcbeb548d02bfb1f8c1f not found: ID does not exist" Apr 16 13:15:32.674549 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.674516 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5cfb58c84-fd7rt"] Apr 16 13:15:32.682072 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:32.682043 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5cfb58c84-fd7rt"] Apr 16 13:15:33.882263 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:33.882219 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9241f232-bf65-44f0-8074-704d349dc8db" path="/var/lib/kubelet/pods/9241f232-bf65-44f0-8074-704d349dc8db/volumes" Apr 16 13:15:49.605996 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:49.605896 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:15:49.608240 ip-10-0-137-242 kubenswrapper[2574]: 
I0416 13:15:49.608206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aca4cb2c-c97a-48d4-a12d-3316ad904fc3-metrics-certs\") pod \"network-metrics-daemon-shf7m\" (UID: \"aca4cb2c-c97a-48d4-a12d-3316ad904fc3\") " pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:15:49.783262 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:49.783230 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hkf74\"" Apr 16 13:15:49.791024 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:49.790986 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-shf7m" Apr 16 13:15:49.913133 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:49.913108 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-shf7m"] Apr 16 13:15:49.915886 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:15:49.915855 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaca4cb2c_c97a_48d4_a12d_3316ad904fc3.slice/crio-a41ef209d912ad388d701d089620bca5acb0af635c3ad87e6c24d75fc8ac5c70 WatchSource:0}: Error finding container a41ef209d912ad388d701d089620bca5acb0af635c3ad87e6c24d75fc8ac5c70: Status 404 returned error can't find the container with id a41ef209d912ad388d701d089620bca5acb0af635c3ad87e6c24d75fc8ac5c70 Apr 16 13:15:50.705888 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:50.705841 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-shf7m" event={"ID":"aca4cb2c-c97a-48d4-a12d-3316ad904fc3","Type":"ContainerStarted","Data":"a41ef209d912ad388d701d089620bca5acb0af635c3ad87e6c24d75fc8ac5c70"} Apr 16 13:15:50.873214 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:50.873179 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 13:15:50.873957 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:50.873898 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="alertmanager" containerID="cri-o://caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd" gracePeriod=120 Apr 16 13:15:50.874067 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:50.873953 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy-metric" containerID="cri-o://a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24" gracePeriod=120 Apr 16 13:15:50.874067 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:50.874004 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="config-reloader" containerID="cri-o://523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596" gracePeriod=120 Apr 16 13:15:50.874067 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:50.873972 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy-web" containerID="cri-o://b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769" gracePeriod=120 Apr 16 13:15:50.874067 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:50.874013 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy" containerID="cri-o://5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b" gracePeriod=120 Apr 16 
13:15:50.874306 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:50.873987 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="prom-label-proxy" containerID="cri-o://01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13" gracePeriod=120 Apr 16 13:15:51.367317 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.367221 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7db7bfdf6f-5p5zn"] Apr 16 13:15:51.367765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.367747 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9241f232-bf65-44f0-8074-704d349dc8db" containerName="console" Apr 16 13:15:51.367849 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.367767 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9241f232-bf65-44f0-8074-704d349dc8db" containerName="console" Apr 16 13:15:51.367849 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.367839 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9241f232-bf65-44f0-8074-704d349dc8db" containerName="console" Apr 16 13:15:51.371031 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.371006 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7db7bfdf6f-5p5zn"] Apr 16 13:15:51.371160 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.371134 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.521016 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.520984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-config\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.521189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.521024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-service-ca\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.521189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.521088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-oauth-config\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.521189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.521132 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-trusted-ca-bundle\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.521189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.521154 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-serving-cert\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.521189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.521178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-oauth-serving-cert\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.521351 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.521203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp9xq\" (UniqueName: \"kubernetes.io/projected/f22d56e1-1f03-4547-a594-9ae793ac47c4-kube-api-access-jp9xq\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.622678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.622575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-oauth-config\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.622678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.622636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-trusted-ca-bundle\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.622678 ip-10-0-137-242 kubenswrapper[2574]: 
I0416 13:15:51.622668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-serving-cert\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.622951 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.622699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-oauth-serving-cert\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.622951 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.622724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp9xq\" (UniqueName: \"kubernetes.io/projected/f22d56e1-1f03-4547-a594-9ae793ac47c4-kube-api-access-jp9xq\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.622951 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.622760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-config\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.622951 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.622879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-service-ca\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " 
pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.623558 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.623536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-config\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.623558 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.623545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-trusted-ca-bundle\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.623723 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.623556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-oauth-serving-cert\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.623723 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.623658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-service-ca\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.625304 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.625277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-serving-cert\") pod \"console-7db7bfdf6f-5p5zn\" (UID: 
\"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.625394 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.625384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-oauth-config\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.631678 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.631653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp9xq\" (UniqueName: \"kubernetes.io/projected/f22d56e1-1f03-4547-a594-9ae793ac47c4-kube-api-access-jp9xq\") pod \"console-7db7bfdf6f-5p5zn\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.681594 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.681557 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:15:51.711491 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.711438 2574 generic.go:358] "Generic (PLEG): container finished" podID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerID="01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13" exitCode=0 Apr 16 13:15:51.711491 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.711487 2574 generic.go:358] "Generic (PLEG): container finished" podID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerID="5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b" exitCode=0 Apr 16 13:15:51.711491 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.711496 2574 generic.go:358] "Generic (PLEG): container finished" podID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerID="523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596" exitCode=0 Apr 16 13:15:51.711925 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.711505 2574 generic.go:358] "Generic (PLEG): container finished" podID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerID="caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd" exitCode=0 Apr 16 13:15:51.711925 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.711492 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerDied","Data":"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13"} Apr 16 13:15:51.711925 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.711575 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerDied","Data":"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b"} Apr 16 13:15:51.711925 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.711592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerDied","Data":"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596"} Apr 16 13:15:51.711925 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.711601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerDied","Data":"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd"} Apr 16 13:15:51.713314 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.713289 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-shf7m" event={"ID":"aca4cb2c-c97a-48d4-a12d-3316ad904fc3","Type":"ContainerStarted","Data":"8ab4710ebd1819f60a5ec21aeff62ceb764dcde1b61c86583c7a907a90164f46"} Apr 16 13:15:51.713314 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.713318 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-shf7m" event={"ID":"aca4cb2c-c97a-48d4-a12d-3316ad904fc3","Type":"ContainerStarted","Data":"42c3ba5abd32cf12266af7f275ce783015d6b5318bcfb17a1d201f72b881da83"} Apr 16 13:15:51.732175 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.731862 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-shf7m" podStartSLOduration=253.794960705 podStartE2EDuration="4m14.731839614s" podCreationTimestamp="2026-04-16 13:11:37 +0000 UTC" firstStartedPulling="2026-04-16 13:15:49.917940519 +0000 UTC m=+252.607872313" lastFinishedPulling="2026-04-16 13:15:50.854819411 +0000 UTC m=+253.544751222" observedRunningTime="2026-04-16 13:15:51.729657546 +0000 UTC m=+254.419589363" watchObservedRunningTime="2026-04-16 13:15:51.731839614 +0000 UTC m=+254.421771430" Apr 16 13:15:51.818224 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:51.818197 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-7db7bfdf6f-5p5zn"] Apr 16 13:15:51.820213 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:15:51.820184 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf22d56e1_1f03_4547_a594_9ae793ac47c4.slice/crio-b0a168e683f5ca9dfef9476ce1c7071de74b6aa98722234a45b55d2d37f24aea WatchSource:0}: Error finding container b0a168e683f5ca9dfef9476ce1c7071de74b6aa98722234a45b55d2d37f24aea: Status 404 returned error can't find the container with id b0a168e683f5ca9dfef9476ce1c7071de74b6aa98722234a45b55d2d37f24aea Apr 16 13:15:52.112369 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.112345 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.229257 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229219 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-web-config\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " Apr 16 13:15:52.229439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229266 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-cluster-tls-config\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " Apr 16 13:15:52.229439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229301 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") " Apr 16 13:15:52.229439 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229337 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmvz9\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-kube-api-access-hmvz9\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.229439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229366 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.229439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229398 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-main-db\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.229439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229436 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-tls-assets\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.229764 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229483 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-metric\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.229822 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229768 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-out\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.229822 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229770 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:15:52.229822 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229811 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-web\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.229958 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229851 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-volume\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.229958 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229884 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.230057 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.229954 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-metrics-client-ca\") pod \"20f015bb-b55d-4cf1-b343-e98727d59a5b\" (UID: \"20f015bb-b55d-4cf1-b343-e98727d59a5b\") "
Apr 16 13:15:52.231055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.230130 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:15:52.231055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.230238 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.231055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.230259 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-alertmanager-main-db\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.231055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.230633 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:15:52.232361 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.232335 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-kube-api-access-hmvz9" (OuterVolumeSpecName: "kube-api-access-hmvz9") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "kube-api-access-hmvz9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:15:52.232683 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.232657 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:52.232786 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.232746 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:52.233094 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.232969 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-out" (OuterVolumeSpecName: "config-out") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:15:52.233094 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.233061 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:52.233765 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.233741 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:15:52.233858 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.233810 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-volume" (OuterVolumeSpecName: "config-volume") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:52.233904 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.233883 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:52.237716 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.237697 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:52.244092 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.244065 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-web-config" (OuterVolumeSpecName: "web-config") pod "20f015bb-b55d-4cf1-b343-e98727d59a5b" (UID: "20f015bb-b55d-4cf1-b343-e98727d59a5b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:15:52.331399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331355 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-web-config\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331392 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-cluster-tls-config\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331404 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hmvz9\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-kube-api-access-hmvz9\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331414 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-main-tls\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331425 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/20f015bb-b55d-4cf1-b343-e98727d59a5b-tls-assets\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331435 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331447 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-out\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331486 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331496 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-config-volume\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331504 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20f015bb-b55d-4cf1-b343-e98727d59a5b-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.331717 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.331514 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20f015bb-b55d-4cf1-b343-e98727d59a5b-metrics-client-ca\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:15:52.719245 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.719212 2574 generic.go:358] "Generic (PLEG): container finished" podID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerID="a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24" exitCode=0
Apr 16 13:15:52.719245 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.719240 2574 generic.go:358] "Generic (PLEG): container finished" podID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerID="b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769" exitCode=0
Apr 16 13:15:52.719778 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.719299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerDied","Data":"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24"}
Apr 16 13:15:52.719778 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.719325 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:15:52.719778 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.719345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerDied","Data":"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769"}
Apr 16 13:15:52.719778 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.719357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"20f015bb-b55d-4cf1-b343-e98727d59a5b","Type":"ContainerDied","Data":"a4ecc909b5f84cd3244c4a7c424fcf1c5395335f3327a52e8051a7dc0a627c52"}
Apr 16 13:15:52.719778 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.719373 2574 scope.go:117] "RemoveContainer" containerID="01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13"
Apr 16 13:15:52.720992 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.720968 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db7bfdf6f-5p5zn" event={"ID":"f22d56e1-1f03-4547-a594-9ae793ac47c4","Type":"ContainerStarted","Data":"94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f"}
Apr 16 13:15:52.721104 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.721001 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db7bfdf6f-5p5zn" event={"ID":"f22d56e1-1f03-4547-a594-9ae793ac47c4","Type":"ContainerStarted","Data":"b0a168e683f5ca9dfef9476ce1c7071de74b6aa98722234a45b55d2d37f24aea"}
Apr 16 13:15:52.727396 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.727302 2574 scope.go:117] "RemoveContainer" containerID="a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24"
Apr 16 13:15:52.734269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.734248 2574 scope.go:117] "RemoveContainer" containerID="5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b"
Apr 16 13:15:52.741155 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.741135 2574 scope.go:117] "RemoveContainer" containerID="b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769"
Apr 16 13:15:52.743435 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.743066 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7db7bfdf6f-5p5zn" podStartSLOduration=1.74304739 podStartE2EDuration="1.74304739s" podCreationTimestamp="2026-04-16 13:15:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:15:52.742379872 +0000 UTC m=+255.432311688" watchObservedRunningTime="2026-04-16 13:15:52.74304739 +0000 UTC m=+255.432979206"
Apr 16 13:15:52.748415 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.748396 2574 scope.go:117] "RemoveContainer" containerID="523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596"
Apr 16 13:15:52.755766 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.755741 2574 scope.go:117] "RemoveContainer" containerID="caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd"
Apr 16 13:15:52.757153 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.757134 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 13:15:52.760711 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.760691 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 13:15:52.763919 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.763904 2574 scope.go:117] "RemoveContainer" containerID="39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180"
Apr 16 13:15:52.771002 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.770982 2574 scope.go:117] "RemoveContainer" containerID="01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13"
Apr 16 13:15:52.771278 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:15:52.771259 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13\": container with ID starting with 01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13 not found: ID does not exist" containerID="01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13"
Apr 16 13:15:52.771330 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.771288 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13"} err="failed to get container status \"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13\": rpc error: code = NotFound desc = could not find container \"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13\": container with ID starting with 01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13 not found: ID does not exist"
Apr 16 13:15:52.771330 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.771308 2574 scope.go:117] "RemoveContainer" containerID="a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24"
Apr 16 13:15:52.771602 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:15:52.771583 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24\": container with ID starting with a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24 not found: ID does not exist" containerID="a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24"
Apr 16 13:15:52.771655 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.771608 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24"} err="failed to get container status \"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24\": rpc error: code = NotFound desc = could not find container \"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24\": container with ID starting with a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24 not found: ID does not exist"
Apr 16 13:15:52.771655 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.771625 2574 scope.go:117] "RemoveContainer" containerID="5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b"
Apr 16 13:15:52.771878 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:15:52.771859 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b\": container with ID starting with 5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b not found: ID does not exist" containerID="5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b"
Apr 16 13:15:52.771921 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.771883 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b"} err="failed to get container status \"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b\": rpc error: code = NotFound desc = could not find container \"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b\": container with ID starting with 5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b not found: ID does not exist"
Apr 16 13:15:52.771921 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.771895 2574 scope.go:117] "RemoveContainer" containerID="b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769"
Apr 16 13:15:52.772105 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:15:52.772090 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769\": container with ID starting with b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769 not found: ID does not exist" containerID="b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769"
Apr 16 13:15:52.772145 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.772110 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769"} err="failed to get container status \"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769\": rpc error: code = NotFound desc = could not find container \"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769\": container with ID starting with b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769 not found: ID does not exist"
Apr 16 13:15:52.772145 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.772138 2574 scope.go:117] "RemoveContainer" containerID="523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596"
Apr 16 13:15:52.772339 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:15:52.772324 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596\": container with ID starting with 523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596 not found: ID does not exist" containerID="523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596"
Apr 16 13:15:52.772380 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.772343 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596"} err="failed to get container status \"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596\": rpc error: code = NotFound desc = could not find container \"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596\": container with ID starting with 523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596 not found: ID does not exist"
Apr 16 13:15:52.772380 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.772355 2574 scope.go:117] "RemoveContainer" containerID="caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd"
Apr 16 13:15:52.772571 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:15:52.772556 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd\": container with ID starting with caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd not found: ID does not exist" containerID="caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd"
Apr 16 13:15:52.772644 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.772577 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd"} err="failed to get container status \"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd\": rpc error: code = NotFound desc = could not find container \"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd\": container with ID starting with caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd not found: ID does not exist"
Apr 16 13:15:52.772644 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.772595 2574 scope.go:117] "RemoveContainer" containerID="39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180"
Apr 16 13:15:52.772829 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:15:52.772813 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180\": container with ID starting with 39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180 not found: ID does not exist" containerID="39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180"
Apr 16 13:15:52.772870 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.772834 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180"} err="failed to get container status \"39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180\": rpc error: code = NotFound desc = could not find container \"39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180\": container with ID starting with 39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180 not found: ID does not exist"
Apr 16 13:15:52.772870 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.772853 2574 scope.go:117] "RemoveContainer" containerID="01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13"
Apr 16 13:15:52.773083 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773062 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13"} err="failed to get container status \"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13\": rpc error: code = NotFound desc = could not find container \"01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13\": container with ID starting with 01bfc276f8e65e43afaa501e9d6cd4b91b481b5c6539e046d3860d06da1c4b13 not found: ID does not exist"
Apr 16 13:15:52.773083 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773078 2574 scope.go:117] "RemoveContainer" containerID="a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24"
Apr 16 13:15:52.773264 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773250 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24"} err="failed to get container status \"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24\": rpc error: code = NotFound desc = could not find container \"a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24\": container with ID starting with a9ff28ed740399d7b7e92010b68453c9ba447b6ef9dcfd96863daa9f9e7d8b24 not found: ID does not exist"
Apr 16 13:15:52.773304 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773267 2574 scope.go:117] "RemoveContainer" containerID="5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b"
Apr 16 13:15:52.773495 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773443 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b"} err="failed to get container status \"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b\": rpc error: code = NotFound desc = could not find container \"5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b\": container with ID starting with 5b1c3a7ea004046f1b18bd76eda3338f30262551f6167d5601a45cd217cb059b not found: ID does not exist"
Apr 16 13:15:52.773495 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773472 2574 scope.go:117] "RemoveContainer" containerID="b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769"
Apr 16 13:15:52.773651 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773635 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769"} err="failed to get container status \"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769\": rpc error: code = NotFound desc = could not find container \"b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769\": container with ID starting with b5c5ac4df9e900909fcb8cc0d246c21375baef25dad4fefce5c88e793cde6769 not found: ID does not exist"
Apr 16 13:15:52.773697 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773651 2574 scope.go:117] "RemoveContainer" containerID="523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596"
Apr 16 13:15:52.773872 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773841 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596"} err="failed to get container status \"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596\": rpc error: code = NotFound desc = could not find container \"523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596\": container with ID starting with 523c6fb05fa118c35c971c0b1f254245ba96fb145eea6ad81190fae0ef393596 not found: ID does not exist"
Apr 16 13:15:52.773955 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.773874 2574 scope.go:117] "RemoveContainer" containerID="caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd"
Apr 16 13:15:52.774144 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.774128 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd"} err="failed to get container status \"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd\": rpc error: code = NotFound desc = could not find container \"caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd\": container with ID starting with caad5c3103503b3dcae12d8692cdb392306321875e8311479c6aac4c1cee9ebd not found: ID does not exist"
Apr 16 13:15:52.774190 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.774145 2574 scope.go:117] "RemoveContainer" containerID="39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180"
Apr 16 13:15:52.774476 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.774387 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180"} err="failed to get container status \"39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180\": rpc error: code = NotFound desc = could not find container \"39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180\": container with ID starting with 39449cd867f68a3df84128ecd3dc62274772344d08cd370cdf0d3bb106ffc180 not found: ID does not exist"
Apr 16 13:15:52.783815 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.783789 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 13:15:52.784151 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784136 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy"
Apr 16 13:15:52.784218 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784153 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy"
Apr 16 13:15:52.784218 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784170 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy-web"
Apr 16 13:15:52.784218 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784179 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy-web"
Apr 16 13:15:52.784218 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784192 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="config-reloader"
Apr 16 13:15:52.784218 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784200 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="config-reloader"
Apr 16 13:15:52.784218 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784214 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy-metric"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784222 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy-metric"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784240 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="prom-label-proxy"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784249 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="prom-label-proxy"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784264 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="init-config-reloader"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784272 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="init-config-reloader"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784283 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="alertmanager"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784293 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="alertmanager"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784363 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy-metric"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784379 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="alertmanager"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784389 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy-web"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784399 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="config-reloader"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784410 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="kube-rbac-proxy"
Apr 16 13:15:52.784535 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.784423 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" containerName="prom-label-proxy"
Apr 16 13:15:52.789989 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.789968 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 13:15:52.792913 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.792894 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-7dhhl\""
Apr 16 13:15:52.793025 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.792912 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 13:15:52.793109 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.793092 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 13:15:52.793165 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.793111 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 13:15:52.793165 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.793101 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 13:15:52.793397 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.793380 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 13:15:52.793467 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.793435 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 13:15:52.793548 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.793531 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 13:15:52.793824 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.793807 2574 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 13:15:52.798132 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.798116 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 13:15:52.807611 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.807587 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 13:15:52.936819 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.936767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.936819 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.936819 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.936896 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/40fdff14-28a4-49a1-bd45-05f59bae4cbf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.936933 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-config-volume\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.936975 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-web-config\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.937029 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937206 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.937073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40fdff14-28a4-49a1-bd45-05f59bae4cbf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937206 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.937111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937206 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.937128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937206 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.937157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxh9x\" (UniqueName: \"kubernetes.io/projected/40fdff14-28a4-49a1-bd45-05f59bae4cbf-kube-api-access-cxh9x\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937353 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.937220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40fdff14-28a4-49a1-bd45-05f59bae4cbf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937353 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.937239 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40fdff14-28a4-49a1-bd45-05f59bae4cbf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:52.937353 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:52.937281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/40fdff14-28a4-49a1-bd45-05f59bae4cbf-config-out\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038190 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40fdff14-28a4-49a1-bd45-05f59bae4cbf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038190 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038148 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40fdff14-28a4-49a1-bd45-05f59bae4cbf-config-out\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038190 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038444 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038279 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038444 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038305 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/40fdff14-28a4-49a1-bd45-05f59bae4cbf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038444 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038329 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-config-volume\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038444 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038369 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-web-config\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038660 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038496 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038660 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40fdff14-28a4-49a1-bd45-05f59bae4cbf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038660 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:15:53.038621 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038660 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038859 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxh9x\" (UniqueName: \"kubernetes.io/projected/40fdff14-28a4-49a1-bd45-05f59bae4cbf-kube-api-access-cxh9x\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038859 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038705 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/40fdff14-28a4-49a1-bd45-05f59bae4cbf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.038859 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40fdff14-28a4-49a1-bd45-05f59bae4cbf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.039032 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.038923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40fdff14-28a4-49a1-bd45-05f59bae4cbf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.039632 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.039602 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40fdff14-28a4-49a1-bd45-05f59bae4cbf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.041540 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.041516 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-web-config\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.041540 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.041545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40fdff14-28a4-49a1-bd45-05f59bae4cbf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.041759 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.041586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.041759 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.041739 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-config-volume\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.041928 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.041792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.042001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.041954 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40fdff14-28a4-49a1-bd45-05f59bae4cbf-config-out\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.042130 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.042110 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.042937 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.042915 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-cluster-tls-config\") pod 
\"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.044127 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.044106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/40fdff14-28a4-49a1-bd45-05f59bae4cbf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.050981 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.050952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxh9x\" (UniqueName: \"kubernetes.io/projected/40fdff14-28a4-49a1-bd45-05f59bae4cbf-kube-api-access-cxh9x\") pod \"alertmanager-main-0\" (UID: \"40fdff14-28a4-49a1-bd45-05f59bae4cbf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.099665 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.099636 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 13:15:53.225615 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.225580 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 13:15:53.247192 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:15:53.241731 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40fdff14_28a4_49a1_bd45_05f59bae4cbf.slice/crio-c0624423575e799959e87632c80557f133fcc7cbf9581b88ab6a58483fd514ca WatchSource:0}: Error finding container c0624423575e799959e87632c80557f133fcc7cbf9581b88ab6a58483fd514ca: Status 404 returned error can't find the container with id c0624423575e799959e87632c80557f133fcc7cbf9581b88ab6a58483fd514ca Apr 16 13:15:53.731327 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.731287 2574 generic.go:358] "Generic (PLEG): container finished" podID="40fdff14-28a4-49a1-bd45-05f59bae4cbf" containerID="024368e54f1bb35c0d502ae09dfd196e2d729a24c5843136b3bba64cbb489a9c" exitCode=0 Apr 16 13:15:53.731774 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.731368 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"40fdff14-28a4-49a1-bd45-05f59bae4cbf","Type":"ContainerDied","Data":"024368e54f1bb35c0d502ae09dfd196e2d729a24c5843136b3bba64cbb489a9c"} Apr 16 13:15:53.731774 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.731402 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"40fdff14-28a4-49a1-bd45-05f59bae4cbf","Type":"ContainerStarted","Data":"c0624423575e799959e87632c80557f133fcc7cbf9581b88ab6a58483fd514ca"} Apr 16 13:15:53.883306 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:53.883278 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20f015bb-b55d-4cf1-b343-e98727d59a5b" 
path="/var/lib/kubelet/pods/20f015bb-b55d-4cf1-b343-e98727d59a5b/volumes" Apr 16 13:15:54.737203 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:54.737163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"40fdff14-28a4-49a1-bd45-05f59bae4cbf","Type":"ContainerStarted","Data":"a1786cf0d85ca480610c5284d51ee1c9d6d34d226534bfd43f1b7dcb2b5953e5"} Apr 16 13:15:54.737203 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:54.737207 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"40fdff14-28a4-49a1-bd45-05f59bae4cbf","Type":"ContainerStarted","Data":"b76843c8e928c937148c27007fac4c29cc8579d1812e08c86f818431645c46e7"} Apr 16 13:15:54.737640 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:54.737220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"40fdff14-28a4-49a1-bd45-05f59bae4cbf","Type":"ContainerStarted","Data":"265bb626ef631dba864a60e11867fde94a4ae51236952c6eecf86f9e275d2682"} Apr 16 13:15:54.737640 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:54.737233 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"40fdff14-28a4-49a1-bd45-05f59bae4cbf","Type":"ContainerStarted","Data":"a78d33480221045c39c49a1b9aa46cd263b9b70ebd0c7b95b6f5f8687821a110"} Apr 16 13:15:54.737640 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:54.737246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"40fdff14-28a4-49a1-bd45-05f59bae4cbf","Type":"ContainerStarted","Data":"a475f788d0b1af8f348d38bf287416667b03d063592cf9a67ea269af42948948"} Apr 16 13:15:54.737640 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:54.737257 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"40fdff14-28a4-49a1-bd45-05f59bae4cbf","Type":"ContainerStarted","Data":"4008eff84cb07eb2ae016b294845d6ac9793bf65e7147da11246113aaf1aa155"} Apr 16 13:15:54.766217 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:15:54.766164 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.766148934 podStartE2EDuration="2.766148934s" podCreationTimestamp="2026-04-16 13:15:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:15:54.764731383 +0000 UTC m=+257.454663199" watchObservedRunningTime="2026-04-16 13:15:54.766148934 +0000 UTC m=+257.456080748" Apr 16 13:16:01.681733 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:01.681688 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:16:01.682127 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:01.681747 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:16:01.686400 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:01.686374 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:16:01.761598 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:01.761571 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:16:01.806699 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:01.806660 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f8b46df5b-nj7bv"] Apr 16 13:16:12.425123 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.425085 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-d8x8s"] Apr 16 13:16:12.429713 ip-10-0-137-242 kubenswrapper[2574]: 
I0416 13:16:12.429693 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.432618 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.432586 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 13:16:12.437788 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.437762 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d8x8s"] Apr 16 13:16:12.500065 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.500021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a67fb660-e171-411d-96d9-bf4f1692c5d7-original-pull-secret\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.500065 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.500073 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a67fb660-e171-411d-96d9-bf4f1692c5d7-dbus\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.500269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.500150 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a67fb660-e171-411d-96d9-bf4f1692c5d7-kubelet-config\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.600588 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.600544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a67fb660-e171-411d-96d9-bf4f1692c5d7-kubelet-config\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.600769 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.600612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a67fb660-e171-411d-96d9-bf4f1692c5d7-original-pull-secret\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.600769 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.600644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a67fb660-e171-411d-96d9-bf4f1692c5d7-dbus\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.600769 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.600684 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a67fb660-e171-411d-96d9-bf4f1692c5d7-kubelet-config\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.600884 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.600805 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a67fb660-e171-411d-96d9-bf4f1692c5d7-dbus\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s" Apr 16 13:16:12.603036 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.603013 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a67fb660-e171-411d-96d9-bf4f1692c5d7-original-pull-secret\") pod \"global-pull-secret-syncer-d8x8s\" (UID: \"a67fb660-e171-411d-96d9-bf4f1692c5d7\") " pod="kube-system/global-pull-secret-syncer-d8x8s"
Apr 16 13:16:12.740526 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.740404 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d8x8s"
Apr 16 13:16:12.858559 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:12.858437 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d8x8s"]
Apr 16 13:16:12.861350 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:16:12.861316 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda67fb660_e171_411d_96d9_bf4f1692c5d7.slice/crio-717ea9345f83bf7728fa21f123b2d177bdcd95f44682d1e677bacd6646b4ac64 WatchSource:0}: Error finding container 717ea9345f83bf7728fa21f123b2d177bdcd95f44682d1e677bacd6646b4ac64: Status 404 returned error can't find the container with id 717ea9345f83bf7728fa21f123b2d177bdcd95f44682d1e677bacd6646b4ac64
Apr 16 13:16:13.796837 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:13.796796 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d8x8s" event={"ID":"a67fb660-e171-411d-96d9-bf4f1692c5d7","Type":"ContainerStarted","Data":"717ea9345f83bf7728fa21f123b2d177bdcd95f44682d1e677bacd6646b4ac64"}
Apr 16 13:16:18.814711 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:18.814081 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d8x8s" event={"ID":"a67fb660-e171-411d-96d9-bf4f1692c5d7","Type":"ContainerStarted","Data":"4a88bc9df65b0d70c927988ba596982d27bc24129412438b2ee5bbcbf441789d"}
Apr 16 13:16:18.830382 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:18.830325 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-d8x8s" podStartSLOduration=1.080935921 podStartE2EDuration="6.830309048s" podCreationTimestamp="2026-04-16 13:16:12 +0000 UTC" firstStartedPulling="2026-04-16 13:16:12.86300027 +0000 UTC m=+275.552932062" lastFinishedPulling="2026-04-16 13:16:18.612373383 +0000 UTC m=+281.302305189" observedRunningTime="2026-04-16 13:16:18.829482674 +0000 UTC m=+281.519414488" watchObservedRunningTime="2026-04-16 13:16:18.830309048 +0000 UTC m=+281.520240856"
Apr 16 13:16:26.829160 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:26.829100 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f8b46df5b-nj7bv" podUID="2cc78166-8f87-496d-a3a8-8a6aaf3c0173" containerName="console" containerID="cri-o://0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e" gracePeriod=15
Apr 16 13:16:27.063684 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.063662 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f8b46df5b-nj7bv_2cc78166-8f87-496d-a3a8-8a6aaf3c0173/console/0.log"
Apr 16 13:16:27.063815 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.063722 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:16:27.124659 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.124566 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49dl\" (UniqueName: \"kubernetes.io/projected/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-kube-api-access-k49dl\") pod \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") "
Apr 16 13:16:27.124659 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.124626 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-oauth-serving-cert\") pod \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") "
Apr 16 13:16:27.124659 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.124655 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-config\") pod \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") "
Apr 16 13:16:27.124926 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.124671 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-trusted-ca-bundle\") pod \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") "
Apr 16 13:16:27.124926 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.124693 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-oauth-config\") pod \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") "
Apr 16 13:16:27.124926 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.124734 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-service-ca\") pod \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") "
Apr 16 13:16:27.124926 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.124778 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-serving-cert\") pod \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\" (UID: \"2cc78166-8f87-496d-a3a8-8a6aaf3c0173\") "
Apr 16 13:16:27.125201 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.125157 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2cc78166-8f87-496d-a3a8-8a6aaf3c0173" (UID: "2cc78166-8f87-496d-a3a8-8a6aaf3c0173"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:16:27.125201 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.125168 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-config" (OuterVolumeSpecName: "console-config") pod "2cc78166-8f87-496d-a3a8-8a6aaf3c0173" (UID: "2cc78166-8f87-496d-a3a8-8a6aaf3c0173"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:16:27.125201 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.125178 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2cc78166-8f87-496d-a3a8-8a6aaf3c0173" (UID: "2cc78166-8f87-496d-a3a8-8a6aaf3c0173"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:16:27.125371 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.125207 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-service-ca" (OuterVolumeSpecName: "service-ca") pod "2cc78166-8f87-496d-a3a8-8a6aaf3c0173" (UID: "2cc78166-8f87-496d-a3a8-8a6aaf3c0173"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 13:16:27.126993 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.126957 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-kube-api-access-k49dl" (OuterVolumeSpecName: "kube-api-access-k49dl") pod "2cc78166-8f87-496d-a3a8-8a6aaf3c0173" (UID: "2cc78166-8f87-496d-a3a8-8a6aaf3c0173"). InnerVolumeSpecName "kube-api-access-k49dl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:16:27.126993 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.126965 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2cc78166-8f87-496d-a3a8-8a6aaf3c0173" (UID: "2cc78166-8f87-496d-a3a8-8a6aaf3c0173"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:16:27.127112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.127041 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2cc78166-8f87-496d-a3a8-8a6aaf3c0173" (UID: "2cc78166-8f87-496d-a3a8-8a6aaf3c0173"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:16:27.225619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.225575 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k49dl\" (UniqueName: \"kubernetes.io/projected/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-kube-api-access-k49dl\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:27.225619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.225612 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-oauth-serving-cert\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:27.225619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.225628 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-config\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:27.225862 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.225640 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-trusted-ca-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:27.225862 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.225652 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-oauth-config\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:27.225862 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.225664 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-service-ca\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:27.225862 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.225677 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2cc78166-8f87-496d-a3a8-8a6aaf3c0173-console-serving-cert\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:27.844149 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.844121 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f8b46df5b-nj7bv_2cc78166-8f87-496d-a3a8-8a6aaf3c0173/console/0.log"
Apr 16 13:16:27.844654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.844162 2574 generic.go:358] "Generic (PLEG): container finished" podID="2cc78166-8f87-496d-a3a8-8a6aaf3c0173" containerID="0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e" exitCode=2
Apr 16 13:16:27.844654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.844222 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8b46df5b-nj7bv" event={"ID":"2cc78166-8f87-496d-a3a8-8a6aaf3c0173","Type":"ContainerDied","Data":"0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e"}
Apr 16 13:16:27.844654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.844243 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f8b46df5b-nj7bv" event={"ID":"2cc78166-8f87-496d-a3a8-8a6aaf3c0173","Type":"ContainerDied","Data":"7969f8364a6a97b650bb9b2fc846fb9d202221e5f1d20ec54b853358c1f5f32a"}
Apr 16 13:16:27.844654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.844249 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f8b46df5b-nj7bv"
Apr 16 13:16:27.844654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.844257 2574 scope.go:117] "RemoveContainer" containerID="0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e"
Apr 16 13:16:27.853188 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.853165 2574 scope.go:117] "RemoveContainer" containerID="0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e"
Apr 16 13:16:27.853520 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:16:27.853502 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e\": container with ID starting with 0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e not found: ID does not exist" containerID="0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e"
Apr 16 13:16:27.853586 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.853530 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e"} err="failed to get container status \"0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e\": rpc error: code = NotFound desc = could not find container \"0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e\": container with ID starting with 0cfdd1b2864cbe659113c1ee3174bb838dd98614ae1c0a29e75e320cc9c8935e not found: ID does not exist"
Apr 16 13:16:27.867598 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.867569 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f8b46df5b-nj7bv"]
Apr 16 13:16:27.872629 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.872592 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f8b46df5b-nj7bv"]
Apr 16 13:16:27.882651 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:27.882618 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc78166-8f87-496d-a3a8-8a6aaf3c0173" path="/var/lib/kubelet/pods/2cc78166-8f87-496d-a3a8-8a6aaf3c0173/volumes"
Apr 16 13:16:37.769035 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:37.768999 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log"
Apr 16 13:16:37.769579 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:37.769112 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log"
Apr 16 13:16:37.773654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:37.773624 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log"
Apr 16 13:16:37.773837 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:37.773679 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log"
Apr 16 13:16:37.780254 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:37.780229 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 13:16:41.519652 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.519618 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"]
Apr 16 13:16:41.521417 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.520076 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2cc78166-8f87-496d-a3a8-8a6aaf3c0173" containerName="console"
Apr 16 13:16:41.521417 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.520093 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc78166-8f87-496d-a3a8-8a6aaf3c0173" containerName="console"
Apr 16 13:16:41.521417 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.520157 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2cc78166-8f87-496d-a3a8-8a6aaf3c0173" containerName="console"
Apr 16 13:16:41.522441 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.522423 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.525376 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.525351 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 13:16:41.525524 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.525491 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 13:16:41.525524 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.525517 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-682jg\""
Apr 16 13:16:41.531830 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.531804 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"]
Apr 16 13:16:41.651143 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.651099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplqt\" (UniqueName: \"kubernetes.io/projected/b7a04265-ca31-4210-8d55-76bdc4dfa32c-kube-api-access-wplqt\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.651360 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.651165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.651360 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.651292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.752577 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.752524 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.752577 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.752583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wplqt\" (UniqueName: \"kubernetes.io/projected/b7a04265-ca31-4210-8d55-76bdc4dfa32c-kube-api-access-wplqt\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.752797 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.752652 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.752935 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.752915 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.752988 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.752968 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.761620 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.761582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wplqt\" (UniqueName: \"kubernetes.io/projected/b7a04265-ca31-4210-8d55-76bdc4dfa32c-kube-api-access-wplqt\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.832860 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.832754 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:41.957619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.957591 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"]
Apr 16 13:16:41.959567 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:16:41.959538 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7a04265_ca31_4210_8d55_76bdc4dfa32c.slice/crio-2797f755fb0f100113bbff797e535c170c7d83648e37b393d3d4ff63b759681c WatchSource:0}: Error finding container 2797f755fb0f100113bbff797e535c170c7d83648e37b393d3d4ff63b759681c: Status 404 returned error can't find the container with id 2797f755fb0f100113bbff797e535c170c7d83648e37b393d3d4ff63b759681c
Apr 16 13:16:41.961573 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:41.961557 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:16:42.891646 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:42.891604 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4" event={"ID":"b7a04265-ca31-4210-8d55-76bdc4dfa32c","Type":"ContainerStarted","Data":"2797f755fb0f100113bbff797e535c170c7d83648e37b393d3d4ff63b759681c"}
Apr 16 13:16:47.909602 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:47.909568 2574 generic.go:358] "Generic (PLEG): container finished" podID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerID="d3b9c2b752d5c94938d04618c452c3870f33ce776403eb5d0470ab4423d53561" exitCode=0
Apr 16 13:16:47.910003 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:47.909651 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4" event={"ID":"b7a04265-ca31-4210-8d55-76bdc4dfa32c","Type":"ContainerDied","Data":"d3b9c2b752d5c94938d04618c452c3870f33ce776403eb5d0470ab4423d53561"}
Apr 16 13:16:49.918263 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:49.918229 2574 generic.go:358] "Generic (PLEG): container finished" podID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerID="2dde9cbb99fcdfd1bb6d958c8d02ac78117c9ce17f23469b870267d6cb723440" exitCode=0
Apr 16 13:16:49.918676 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:49.918311 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4" event={"ID":"b7a04265-ca31-4210-8d55-76bdc4dfa32c","Type":"ContainerDied","Data":"2dde9cbb99fcdfd1bb6d958c8d02ac78117c9ce17f23469b870267d6cb723440"}
Apr 16 13:16:55.939829 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:55.939778 2574 generic.go:358] "Generic (PLEG): container finished" podID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerID="7585ee6792e908d9c343641a102f79efe334cd6e77020732a3f7fa677faa2918" exitCode=0
Apr 16 13:16:55.940231 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:55.939874 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4" event={"ID":"b7a04265-ca31-4210-8d55-76bdc4dfa32c","Type":"ContainerDied","Data":"7585ee6792e908d9c343641a102f79efe334cd6e77020732a3f7fa677faa2918"}
Apr 16 13:16:57.066925 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.066897 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:16:57.088561 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.088528 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-bundle\") pod \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") "
Apr 16 13:16:57.088743 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.088579 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wplqt\" (UniqueName: \"kubernetes.io/projected/b7a04265-ca31-4210-8d55-76bdc4dfa32c-kube-api-access-wplqt\") pod \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") "
Apr 16 13:16:57.088743 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.088609 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-util\") pod \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\" (UID: \"b7a04265-ca31-4210-8d55-76bdc4dfa32c\") "
Apr 16 13:16:57.089184 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.089157 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-bundle" (OuterVolumeSpecName: "bundle") pod "b7a04265-ca31-4210-8d55-76bdc4dfa32c" (UID: "b7a04265-ca31-4210-8d55-76bdc4dfa32c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:16:57.090921 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.090891 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a04265-ca31-4210-8d55-76bdc4dfa32c-kube-api-access-wplqt" (OuterVolumeSpecName: "kube-api-access-wplqt") pod "b7a04265-ca31-4210-8d55-76bdc4dfa32c" (UID: "b7a04265-ca31-4210-8d55-76bdc4dfa32c"). InnerVolumeSpecName "kube-api-access-wplqt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:16:57.094132 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.094100 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-util" (OuterVolumeSpecName: "util") pod "b7a04265-ca31-4210-8d55-76bdc4dfa32c" (UID: "b7a04265-ca31-4210-8d55-76bdc4dfa32c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:16:57.190199 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.190157 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:57.190199 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.190191 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wplqt\" (UniqueName: \"kubernetes.io/projected/b7a04265-ca31-4210-8d55-76bdc4dfa32c-kube-api-access-wplqt\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:57.190199 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.190203 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b7a04265-ca31-4210-8d55-76bdc4dfa32c-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:16:57.947154 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.947121 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4" event={"ID":"b7a04265-ca31-4210-8d55-76bdc4dfa32c","Type":"ContainerDied","Data":"2797f755fb0f100113bbff797e535c170c7d83648e37b393d3d4ff63b759681c"}
Apr 16 13:16:57.947154 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.947158 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2797f755fb0f100113bbff797e535c170c7d83648e37b393d3d4ff63b759681c"
Apr 16 13:16:57.947357 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:16:57.947166 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vklf4"
Apr 16 13:17:03.858968 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.858926 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"]
Apr 16 13:17:03.859439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.859401 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerName="util"
Apr 16 13:17:03.859439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.859420 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerName="util"
Apr 16 13:17:03.859596 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.859439 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerName="extract"
Apr 16 13:17:03.859596 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.859448 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerName="extract"
Apr 16 13:17:03.859596 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.859487 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerName="pull"
Apr 16 13:17:03.859596 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.859497 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerName="pull"
Apr 16 13:17:03.859596 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.859586 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7a04265-ca31-4210-8d55-76bdc4dfa32c" containerName="extract"
Apr 16 13:17:03.862502 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.862448 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"
Apr 16 13:17:03.867170 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.867147 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-ln69w\""
Apr 16 13:17:03.867293 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.867178 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:17:03.867293 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.867207 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 13:17:03.875309 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.875286 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"]
Apr 16 13:17:03.943984 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.943948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/840d9ed3-8ae7-4e2c-9239-2d046f4632e2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8ln2p\" (UID: \"840d9ed3-8ae7-4e2c-9239-2d046f4632e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"
Apr 16 13:17:03.944413 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:03.943991 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh9wq\" (UniqueName: \"kubernetes.io/projected/840d9ed3-8ae7-4e2c-9239-2d046f4632e2-kube-api-access-xh9wq\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8ln2p\" (UID: \"840d9ed3-8ae7-4e2c-9239-2d046f4632e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"
Apr 16 13:17:04.045379 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:04.045332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/840d9ed3-8ae7-4e2c-9239-2d046f4632e2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8ln2p\" (UID: \"840d9ed3-8ae7-4e2c-9239-2d046f4632e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"
Apr 16 13:17:04.045379 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:04.045380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xh9wq\" (UniqueName: \"kubernetes.io/projected/840d9ed3-8ae7-4e2c-9239-2d046f4632e2-kube-api-access-xh9wq\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8ln2p\" (UID: \"840d9ed3-8ae7-4e2c-9239-2d046f4632e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"
Apr 16 13:17:04.045782 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:04.045760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/840d9ed3-8ae7-4e2c-9239-2d046f4632e2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8ln2p\" (UID: \"840d9ed3-8ae7-4e2c-9239-2d046f4632e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"
Apr 16 13:17:04.053979 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:04.053947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh9wq\" (UniqueName: \"kubernetes.io/projected/840d9ed3-8ae7-4e2c-9239-2d046f4632e2-kube-api-access-xh9wq\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-8ln2p\" (UID: \"840d9ed3-8ae7-4e2c-9239-2d046f4632e2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"
Apr 16 13:17:04.171721 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:04.171691 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"
Apr 16 13:17:04.305608 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:04.305584 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p"]
Apr 16 13:17:04.307740 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:17:04.307715 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod840d9ed3_8ae7_4e2c_9239_2d046f4632e2.slice/crio-e94d8a8a128b01a1f1ce2254f4c2304be6368fa44bcf5acce982a206ef370fa2 WatchSource:0}: Error finding container e94d8a8a128b01a1f1ce2254f4c2304be6368fa44bcf5acce982a206ef370fa2: Status 404 returned error can't find the container with id e94d8a8a128b01a1f1ce2254f4c2304be6368fa44bcf5acce982a206ef370fa2
Apr 16 13:17:04.969849 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:04.969808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p" event={"ID":"840d9ed3-8ae7-4e2c-9239-2d046f4632e2","Type":"ContainerStarted","Data":"e94d8a8a128b01a1f1ce2254f4c2304be6368fa44bcf5acce982a206ef370fa2"}
Apr 16 13:17:06.977940 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:06.977909 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p" event={"ID":"840d9ed3-8ae7-4e2c-9239-2d046f4632e2","Type":"ContainerStarted","Data":"89a1244adc2afe391b3e69cf67aade9b4cf714766692c13e3ecd57b8ef394f92"}
Apr 16 13:17:07.002889 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:07.002819 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-8ln2p" podStartSLOduration=2.232813962 podStartE2EDuration="4.00279965s" podCreationTimestamp="2026-04-16 13:17:03 +0000 UTC" firstStartedPulling="2026-04-16 13:17:04.310045799 +0000 UTC m=+326.999977596" lastFinishedPulling="2026-04-16 13:17:06.080031488 +0000 UTC m=+328.769963284" observedRunningTime="2026-04-16 13:17:07.001654524 +0000 UTC m=+329.691586339" watchObservedRunningTime="2026-04-16 13:17:07.00279965 +0000 UTC m=+329.692731466"
Apr 16 13:17:08.478087 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.478045 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52"]
Apr 16 13:17:08.480685 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.480662 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.483536 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.483511 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 13:17:08.483643 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.483543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-682jg\"" Apr 16 13:17:08.484909 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.484892 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 13:17:08.492614 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.492591 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52"] Apr 16 13:17:08.582546 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.582508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb5lk\" (UniqueName: \"kubernetes.io/projected/3898eb1a-c28a-42dd-aaaf-57cf24de7941-kube-api-access-sb5lk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.582729 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.582574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.582729 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.582655 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.683735 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.683684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.683735 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.683744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sb5lk\" (UniqueName: \"kubernetes.io/projected/3898eb1a-c28a-42dd-aaaf-57cf24de7941-kube-api-access-sb5lk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.683967 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.683793 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.684081 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.684059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.684136 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.684081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.692106 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.692081 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb5lk\" (UniqueName: \"kubernetes.io/projected/3898eb1a-c28a-42dd-aaaf-57cf24de7941-kube-api-access-sb5lk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.789416 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.789322 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:08.912685 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.912656 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52"] Apr 16 13:17:08.914491 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:17:08.914448 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3898eb1a_c28a_42dd_aaaf_57cf24de7941.slice/crio-25f0e02a2901922b28873e45ec2dc29b3c0395bc989b5a6f52a4a62699861257 WatchSource:0}: Error finding container 25f0e02a2901922b28873e45ec2dc29b3c0395bc989b5a6f52a4a62699861257: Status 404 returned error can't find the container with id 25f0e02a2901922b28873e45ec2dc29b3c0395bc989b5a6f52a4a62699861257 Apr 16 13:17:08.986165 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.986130 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" event={"ID":"3898eb1a-c28a-42dd-aaaf-57cf24de7941","Type":"ContainerStarted","Data":"89a82338e0bd14760af5a76070db9cbe3c2637db7cb9e48aa2cfc5c77d4f2a61"} Apr 16 13:17:08.986308 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:08.986172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" event={"ID":"3898eb1a-c28a-42dd-aaaf-57cf24de7941","Type":"ContainerStarted","Data":"25f0e02a2901922b28873e45ec2dc29b3c0395bc989b5a6f52a4a62699861257"} Apr 16 13:17:09.969857 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:09.969773 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-52mjs"] Apr 16 13:17:09.972286 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:09.972271 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:09.974889 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:09.974870 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 13:17:09.976235 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:09.976218 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 13:17:09.976328 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:09.976271 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bcfbt\"" Apr 16 13:17:09.982312 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:09.982287 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-52mjs"] Apr 16 13:17:09.990799 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:09.990772 2574 generic.go:358] "Generic (PLEG): container finished" podID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerID="89a82338e0bd14760af5a76070db9cbe3c2637db7cb9e48aa2cfc5c77d4f2a61" exitCode=0 Apr 16 13:17:09.990894 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:09.990814 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" event={"ID":"3898eb1a-c28a-42dd-aaaf-57cf24de7941","Type":"ContainerDied","Data":"89a82338e0bd14760af5a76070db9cbe3c2637db7cb9e48aa2cfc5c77d4f2a61"} Apr 16 13:17:10.094862 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.094826 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ecfa8054-22c4-4266-8957-827ef74624b2-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-52mjs\" (UID: \"ecfa8054-22c4-4266-8957-827ef74624b2\") " 
pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:10.095047 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.094885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b4zh\" (UniqueName: \"kubernetes.io/projected/ecfa8054-22c4-4266-8957-827ef74624b2-kube-api-access-7b4zh\") pod \"cert-manager-webhook-597b96b99b-52mjs\" (UID: \"ecfa8054-22c4-4266-8957-827ef74624b2\") " pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:10.195860 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.195817 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b4zh\" (UniqueName: \"kubernetes.io/projected/ecfa8054-22c4-4266-8957-827ef74624b2-kube-api-access-7b4zh\") pod \"cert-manager-webhook-597b96b99b-52mjs\" (UID: \"ecfa8054-22c4-4266-8957-827ef74624b2\") " pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:10.196042 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.195895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ecfa8054-22c4-4266-8957-827ef74624b2-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-52mjs\" (UID: \"ecfa8054-22c4-4266-8957-827ef74624b2\") " pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:10.204058 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.204026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ecfa8054-22c4-4266-8957-827ef74624b2-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-52mjs\" (UID: \"ecfa8054-22c4-4266-8957-827ef74624b2\") " pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:10.204189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.204142 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b4zh\" 
(UniqueName: \"kubernetes.io/projected/ecfa8054-22c4-4266-8957-827ef74624b2-kube-api-access-7b4zh\") pod \"cert-manager-webhook-597b96b99b-52mjs\" (UID: \"ecfa8054-22c4-4266-8957-827ef74624b2\") " pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:10.293966 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.293879 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:10.417443 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.417414 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-52mjs"] Apr 16 13:17:10.419996 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:17:10.419972 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecfa8054_22c4_4266_8957_827ef74624b2.slice/crio-82d698082405c4d7db904b6d0b8fed79d58fc823569adeb0a919c76664292fc0 WatchSource:0}: Error finding container 82d698082405c4d7db904b6d0b8fed79d58fc823569adeb0a919c76664292fc0: Status 404 returned error can't find the container with id 82d698082405c4d7db904b6d0b8fed79d58fc823569adeb0a919c76664292fc0 Apr 16 13:17:10.995633 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:10.995598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" event={"ID":"ecfa8054-22c4-4266-8957-827ef74624b2","Type":"ContainerStarted","Data":"82d698082405c4d7db904b6d0b8fed79d58fc823569adeb0a919c76664292fc0"} Apr 16 13:17:12.521429 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.521391 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-6xmcz"] Apr 16 13:17:12.524058 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.524036 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" Apr 16 13:17:12.526977 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.526953 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-hcqbh\"" Apr 16 13:17:12.534164 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.534141 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-6xmcz"] Apr 16 13:17:12.617468 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.617409 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c110972-b91a-47ae-a2f1-c8bb2bfe2979-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-6xmcz\" (UID: \"1c110972-b91a-47ae-a2f1-c8bb2bfe2979\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" Apr 16 13:17:12.617658 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.617490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8btc\" (UniqueName: \"kubernetes.io/projected/1c110972-b91a-47ae-a2f1-c8bb2bfe2979-kube-api-access-b8btc\") pod \"cert-manager-cainjector-8966b78d4-6xmcz\" (UID: \"1c110972-b91a-47ae-a2f1-c8bb2bfe2979\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" Apr 16 13:17:12.719103 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.719062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c110972-b91a-47ae-a2f1-c8bb2bfe2979-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-6xmcz\" (UID: \"1c110972-b91a-47ae-a2f1-c8bb2bfe2979\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" Apr 16 13:17:12.719287 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.719116 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-b8btc\" (UniqueName: \"kubernetes.io/projected/1c110972-b91a-47ae-a2f1-c8bb2bfe2979-kube-api-access-b8btc\") pod \"cert-manager-cainjector-8966b78d4-6xmcz\" (UID: \"1c110972-b91a-47ae-a2f1-c8bb2bfe2979\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" Apr 16 13:17:12.728497 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.728440 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1c110972-b91a-47ae-a2f1-c8bb2bfe2979-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-6xmcz\" (UID: \"1c110972-b91a-47ae-a2f1-c8bb2bfe2979\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" Apr 16 13:17:12.729297 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.729242 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8btc\" (UniqueName: \"kubernetes.io/projected/1c110972-b91a-47ae-a2f1-c8bb2bfe2979-kube-api-access-b8btc\") pod \"cert-manager-cainjector-8966b78d4-6xmcz\" (UID: \"1c110972-b91a-47ae-a2f1-c8bb2bfe2979\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" Apr 16 13:17:12.835109 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:12.834783 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" Apr 16 13:17:14.597672 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:14.597640 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-6xmcz"] Apr 16 13:17:15.012553 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:15.012437 2574 generic.go:358] "Generic (PLEG): container finished" podID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerID="f788bc973568a4d30c08bc6b1382f07b2318b078e0ef26b518f9214005463cb7" exitCode=0 Apr 16 13:17:15.012553 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:15.012488 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" event={"ID":"3898eb1a-c28a-42dd-aaaf-57cf24de7941","Type":"ContainerDied","Data":"f788bc973568a4d30c08bc6b1382f07b2318b078e0ef26b518f9214005463cb7"} Apr 16 13:17:15.014144 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:15.014100 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" event={"ID":"1c110972-b91a-47ae-a2f1-c8bb2bfe2979","Type":"ContainerStarted","Data":"7b844df88991ecad230b7d1e0874027754843786b970ae0868f75633d1098adb"} Apr 16 13:17:15.014144 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:15.014140 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" event={"ID":"1c110972-b91a-47ae-a2f1-c8bb2bfe2979","Type":"ContainerStarted","Data":"09fc4551d5f0473a74d9e54cb6ded45c8baf6597df221e01e04c51c0e2a2578a"} Apr 16 13:17:15.015818 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:15.015782 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" event={"ID":"ecfa8054-22c4-4266-8957-827ef74624b2","Type":"ContainerStarted","Data":"edba1ab7c3c3fc85c4386570048a2e5cd2acdedb55ab2ce5ee873ed6c27f5ace"} Apr 16 13:17:15.016797 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:15.016773 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:15.046768 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:15.046724 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" podStartSLOduration=1.99606125 podStartE2EDuration="6.046707923s" podCreationTimestamp="2026-04-16 13:17:09 +0000 UTC" firstStartedPulling="2026-04-16 13:17:10.421924856 +0000 UTC m=+333.111856652" lastFinishedPulling="2026-04-16 13:17:14.472571514 +0000 UTC m=+337.162503325" observedRunningTime="2026-04-16 13:17:15.045362108 +0000 UTC m=+337.735293924" watchObservedRunningTime="2026-04-16 13:17:15.046707923 +0000 UTC m=+337.736639737" Apr 16 13:17:15.060752 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:15.060704 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-6xmcz" podStartSLOduration=3.060690713 podStartE2EDuration="3.060690713s" podCreationTimestamp="2026-04-16 13:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:17:15.059570811 +0000 UTC m=+337.749502627" watchObservedRunningTime="2026-04-16 13:17:15.060690713 +0000 UTC m=+337.750622529" Apr 16 13:17:16.020226 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:16.020194 2574 generic.go:358] "Generic (PLEG): container finished" podID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerID="e9d8709395baf4d04f868be60017869666dd4e79c6c5e11487771ea3afab0deb" exitCode=0 Apr 16 13:17:16.020662 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:16.020282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" 
event={"ID":"3898eb1a-c28a-42dd-aaaf-57cf24de7941","Type":"ContainerDied","Data":"e9d8709395baf4d04f868be60017869666dd4e79c6c5e11487771ea3afab0deb"} Apr 16 13:17:17.155932 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.155901 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:17.259633 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.259589 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb5lk\" (UniqueName: \"kubernetes.io/projected/3898eb1a-c28a-42dd-aaaf-57cf24de7941-kube-api-access-sb5lk\") pod \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " Apr 16 13:17:17.259805 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.259669 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-util\") pod \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " Apr 16 13:17:17.259805 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.259687 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-bundle\") pod \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\" (UID: \"3898eb1a-c28a-42dd-aaaf-57cf24de7941\") " Apr 16 13:17:17.260110 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.260087 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-bundle" (OuterVolumeSpecName: "bundle") pod "3898eb1a-c28a-42dd-aaaf-57cf24de7941" (UID: "3898eb1a-c28a-42dd-aaaf-57cf24de7941"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:17:17.261750 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.261723 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3898eb1a-c28a-42dd-aaaf-57cf24de7941-kube-api-access-sb5lk" (OuterVolumeSpecName: "kube-api-access-sb5lk") pod "3898eb1a-c28a-42dd-aaaf-57cf24de7941" (UID: "3898eb1a-c28a-42dd-aaaf-57cf24de7941"). InnerVolumeSpecName "kube-api-access-sb5lk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:17:17.264822 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.264787 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-util" (OuterVolumeSpecName: "util") pod "3898eb1a-c28a-42dd-aaaf-57cf24de7941" (UID: "3898eb1a-c28a-42dd-aaaf-57cf24de7941"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:17:17.360567 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.360441 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sb5lk\" (UniqueName: \"kubernetes.io/projected/3898eb1a-c28a-42dd-aaaf-57cf24de7941-kube-api-access-sb5lk\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:17.360567 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.360508 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:17.360567 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:17.360518 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3898eb1a-c28a-42dd-aaaf-57cf24de7941-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:18.029040 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:18.029000 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" event={"ID":"3898eb1a-c28a-42dd-aaaf-57cf24de7941","Type":"ContainerDied","Data":"25f0e02a2901922b28873e45ec2dc29b3c0395bc989b5a6f52a4a62699861257"} Apr 16 13:17:18.029040 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:18.029034 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f0e02a2901922b28873e45ec2dc29b3c0395bc989b5a6f52a4a62699861257" Apr 16 13:17:18.029240 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:18.029052 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ftcv52" Apr 16 13:17:21.023380 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:21.023344 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-52mjs" Apr 16 13:17:27.095709 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.095677 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82"] Apr 16 13:17:27.096194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.096175 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerName="pull" Apr 16 13:17:27.096269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.096198 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerName="pull" Apr 16 13:17:27.096269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.096222 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerName="extract" Apr 16 13:17:27.096269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.096233 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerName="extract" Apr 16 13:17:27.096269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.096246 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerName="util" Apr 16 13:17:27.096269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.096254 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerName="util" Apr 16 13:17:27.096544 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.096341 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3898eb1a-c28a-42dd-aaaf-57cf24de7941" containerName="extract" Apr 16 13:17:27.103687 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.103648 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" Apr 16 13:17:27.106301 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.106269 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82"] Apr 16 13:17:27.106646 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.106625 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:17:27.106748 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.106625 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 13:17:27.108330 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.108315 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-ftdpk\"" Apr 16 13:17:27.241986 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.241947 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/189eea4b-3b7e-4a55-a80e-4b5b36a2bdac-kube-api-access-v57xk\") pod \"openshift-lws-operator-bfc7f696d-zqq82\" (UID: \"189eea4b-3b7e-4a55-a80e-4b5b36a2bdac\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" Apr 16 13:17:27.242176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.242024 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/189eea4b-3b7e-4a55-a80e-4b5b36a2bdac-tmp\") pod \"openshift-lws-operator-bfc7f696d-zqq82\" (UID: \"189eea4b-3b7e-4a55-a80e-4b5b36a2bdac\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" Apr 16 13:17:27.342989 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.342944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/189eea4b-3b7e-4a55-a80e-4b5b36a2bdac-kube-api-access-v57xk\") pod \"openshift-lws-operator-bfc7f696d-zqq82\" (UID: \"189eea4b-3b7e-4a55-a80e-4b5b36a2bdac\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" Apr 16 13:17:27.343164 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.343023 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/189eea4b-3b7e-4a55-a80e-4b5b36a2bdac-tmp\") pod \"openshift-lws-operator-bfc7f696d-zqq82\" (UID: \"189eea4b-3b7e-4a55-a80e-4b5b36a2bdac\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" Apr 16 13:17:27.343348 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.343331 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/189eea4b-3b7e-4a55-a80e-4b5b36a2bdac-tmp\") pod \"openshift-lws-operator-bfc7f696d-zqq82\" (UID: \"189eea4b-3b7e-4a55-a80e-4b5b36a2bdac\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" Apr 16 13:17:27.351842 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.351793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57xk\" (UniqueName: \"kubernetes.io/projected/189eea4b-3b7e-4a55-a80e-4b5b36a2bdac-kube-api-access-v57xk\") pod \"openshift-lws-operator-bfc7f696d-zqq82\" (UID: \"189eea4b-3b7e-4a55-a80e-4b5b36a2bdac\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" Apr 16 13:17:27.414133 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.414099 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" Apr 16 13:17:27.562490 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:27.562437 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82"] Apr 16 13:17:27.564087 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:17:27.564054 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod189eea4b_3b7e_4a55_a80e_4b5b36a2bdac.slice/crio-f2423c6f5a01e56555e22334b14b78e386e70eb8044cecfca7e5ef38283cf0ab WatchSource:0}: Error finding container f2423c6f5a01e56555e22334b14b78e386e70eb8044cecfca7e5ef38283cf0ab: Status 404 returned error can't find the container with id f2423c6f5a01e56555e22334b14b78e386e70eb8044cecfca7e5ef38283cf0ab Apr 16 13:17:28.064829 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:28.064790 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" event={"ID":"189eea4b-3b7e-4a55-a80e-4b5b36a2bdac","Type":"ContainerStarted","Data":"f2423c6f5a01e56555e22334b14b78e386e70eb8044cecfca7e5ef38283cf0ab"} Apr 16 13:17:31.082884 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:31.082847 2574 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" event={"ID":"189eea4b-3b7e-4a55-a80e-4b5b36a2bdac","Type":"ContainerStarted","Data":"40fcf2830ced206b21724c414849673fdc5bbf5924f78153c287a0f7c1942598"} Apr 16 13:17:31.101865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:31.101814 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zqq82" podStartSLOduration=1.706996511 podStartE2EDuration="4.101792506s" podCreationTimestamp="2026-04-16 13:17:27 +0000 UTC" firstStartedPulling="2026-04-16 13:17:27.565831939 +0000 UTC m=+350.255763747" lastFinishedPulling="2026-04-16 13:17:29.960627949 +0000 UTC m=+352.650559742" observedRunningTime="2026-04-16 13:17:31.101013519 +0000 UTC m=+353.790945335" watchObservedRunningTime="2026-04-16 13:17:31.101792506 +0000 UTC m=+353.791724320" Apr 16 13:17:34.675138 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.675102 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf"] Apr 16 13:17:34.679079 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.679061 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.682082 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.682055 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-682jg\"" Apr 16 13:17:34.682224 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.682091 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 13:17:34.682224 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.682117 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 13:17:34.685825 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.685800 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf"] Apr 16 13:17:34.806589 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.806547 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbf97\" (UniqueName: \"kubernetes.io/projected/d7161e6e-3334-4b3c-9335-5a6137cef774-kube-api-access-rbf97\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.806769 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.806599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.806769 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.806718 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.908013 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.907975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbf97\" (UniqueName: \"kubernetes.io/projected/d7161e6e-3334-4b3c-9335-5a6137cef774-kube-api-access-rbf97\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.908193 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.908024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.908193 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.908079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.908575 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.908556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.908575 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.908567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.916913 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.916882 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbf97\" (UniqueName: \"kubernetes.io/projected/d7161e6e-3334-4b3c-9335-5a6137cef774-kube-api-access-rbf97\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:34.989915 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:34.989807 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:35.113468 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:35.113426 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf"] Apr 16 13:17:35.115594 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:17:35.115565 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7161e6e_3334_4b3c_9335_5a6137cef774.slice/crio-b4ae22e34bb9372f3faf621e173756a585f3f142ca8d826647c7d4e876f36433 WatchSource:0}: Error finding container b4ae22e34bb9372f3faf621e173756a585f3f142ca8d826647c7d4e876f36433: Status 404 returned error can't find the container with id b4ae22e34bb9372f3faf621e173756a585f3f142ca8d826647c7d4e876f36433 Apr 16 13:17:36.102546 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:36.102498 2574 generic.go:358] "Generic (PLEG): container finished" podID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerID="bfe301723aa743b300f7a5f812cdd61272c5a9a4fd34c5177b128c18fe100f8a" exitCode=0 Apr 16 13:17:36.102931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:36.102560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" event={"ID":"d7161e6e-3334-4b3c-9335-5a6137cef774","Type":"ContainerDied","Data":"bfe301723aa743b300f7a5f812cdd61272c5a9a4fd34c5177b128c18fe100f8a"} Apr 16 13:17:36.102931 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:36.102589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" event={"ID":"d7161e6e-3334-4b3c-9335-5a6137cef774","Type":"ContainerStarted","Data":"b4ae22e34bb9372f3faf621e173756a585f3f142ca8d826647c7d4e876f36433"} Apr 16 13:17:37.107532 ip-10-0-137-242 kubenswrapper[2574]: 
I0416 13:17:37.107426 2574 generic.go:358] "Generic (PLEG): container finished" podID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerID="b9d56082c33ed2fdbbfaf7c3d0da17887424566d170c3ba77c1d1a385e252ded" exitCode=0 Apr 16 13:17:37.107532 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:37.107514 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" event={"ID":"d7161e6e-3334-4b3c-9335-5a6137cef774","Type":"ContainerDied","Data":"b9d56082c33ed2fdbbfaf7c3d0da17887424566d170c3ba77c1d1a385e252ded"} Apr 16 13:17:37.871661 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:37.871617 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l"] Apr 16 13:17:37.876015 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:37.875990 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:37.879224 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:37.879204 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 13:17:37.880805 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:37.880782 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-vt9jf\"" Apr 16 13:17:37.880990 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:37.880895 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 13:17:37.881285 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:37.880916 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 13:17:37.884802 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:37.884779 2574 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l"] Apr 16 13:17:38.034583 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.034548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-manager-config\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.034729 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.034619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-cert\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.034729 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.034663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mmd\" (UniqueName: \"kubernetes.io/projected/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-kube-api-access-n5mmd\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.034729 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.034682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-metrics-cert\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.113201 ip-10-0-137-242 kubenswrapper[2574]: I0416 
13:17:38.113169 2574 generic.go:358] "Generic (PLEG): container finished" podID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerID="53ac9013c404bd028c54c8f69fca6fb681ab8845b39f2c26c9f3de58c474227a" exitCode=0 Apr 16 13:17:38.113570 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.113257 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" event={"ID":"d7161e6e-3334-4b3c-9335-5a6137cef774","Type":"ContainerDied","Data":"53ac9013c404bd028c54c8f69fca6fb681ab8845b39f2c26c9f3de58c474227a"} Apr 16 13:17:38.136183 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.136092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-cert\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.136183 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.136145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mmd\" (UniqueName: \"kubernetes.io/projected/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-kube-api-access-n5mmd\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.136183 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.136169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-metrics-cert\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.136492 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.136250 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-manager-config\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.136972 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.136948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-manager-config\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.138868 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.138839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-metrics-cert\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.138985 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.138877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-cert\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.150202 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.150177 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mmd\" (UniqueName: \"kubernetes.io/projected/1d3dd9f4-e5b4-4733-af71-8c614aa6cb16-kube-api-access-n5mmd\") pod \"lws-controller-manager-64f4647cd-6vs9l\" (UID: \"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16\") 
" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.187241 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.187194 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:38.313914 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:38.313888 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l"] Apr 16 13:17:38.316576 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:17:38.316543 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3dd9f4_e5b4_4733_af71_8c614aa6cb16.slice/crio-ca58bff30626966885b403de95cc1551171b9cb55301e6284fa93fa61aeedaac WatchSource:0}: Error finding container ca58bff30626966885b403de95cc1551171b9cb55301e6284fa93fa61aeedaac: Status 404 returned error can't find the container with id ca58bff30626966885b403de95cc1551171b9cb55301e6284fa93fa61aeedaac Apr 16 13:17:39.117891 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.117842 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" event={"ID":"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16","Type":"ContainerStarted","Data":"ca58bff30626966885b403de95cc1551171b9cb55301e6284fa93fa61aeedaac"} Apr 16 13:17:39.252150 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.252125 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:39.347088 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.347052 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-util\") pod \"d7161e6e-3334-4b3c-9335-5a6137cef774\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " Apr 16 13:17:39.347269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.347116 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbf97\" (UniqueName: \"kubernetes.io/projected/d7161e6e-3334-4b3c-9335-5a6137cef774-kube-api-access-rbf97\") pod \"d7161e6e-3334-4b3c-9335-5a6137cef774\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " Apr 16 13:17:39.347269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.347206 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-bundle\") pod \"d7161e6e-3334-4b3c-9335-5a6137cef774\" (UID: \"d7161e6e-3334-4b3c-9335-5a6137cef774\") " Apr 16 13:17:39.348054 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.348020 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-bundle" (OuterVolumeSpecName: "bundle") pod "d7161e6e-3334-4b3c-9335-5a6137cef774" (UID: "d7161e6e-3334-4b3c-9335-5a6137cef774"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:17:39.349324 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.349303 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7161e6e-3334-4b3c-9335-5a6137cef774-kube-api-access-rbf97" (OuterVolumeSpecName: "kube-api-access-rbf97") pod "d7161e6e-3334-4b3c-9335-5a6137cef774" (UID: "d7161e6e-3334-4b3c-9335-5a6137cef774"). InnerVolumeSpecName "kube-api-access-rbf97". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:17:39.352779 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.352755 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-util" (OuterVolumeSpecName: "util") pod "d7161e6e-3334-4b3c-9335-5a6137cef774" (UID: "d7161e6e-3334-4b3c-9335-5a6137cef774"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:17:39.448091 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.448043 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:39.448091 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.448089 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d7161e6e-3334-4b3c-9335-5a6137cef774-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:39.448293 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:39.448105 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbf97\" (UniqueName: \"kubernetes.io/projected/d7161e6e-3334-4b3c-9335-5a6137cef774-kube-api-access-rbf97\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:40.123891 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:40.123843 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" event={"ID":"d7161e6e-3334-4b3c-9335-5a6137cef774","Type":"ContainerDied","Data":"b4ae22e34bb9372f3faf621e173756a585f3f142ca8d826647c7d4e876f36433"} Apr 16 13:17:40.123891 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:40.123896 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4ae22e34bb9372f3faf621e173756a585f3f142ca8d826647c7d4e876f36433" Apr 16 13:17:40.124317 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:40.123861 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5fpdzf" Apr 16 13:17:42.131560 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:42.131522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" event={"ID":"1d3dd9f4-e5b4-4733-af71-8c614aa6cb16","Type":"ContainerStarted","Data":"357848ef5388abc17ef417c42ea396403a938c6d7ab45b842e761e6122c598ef"} Apr 16 13:17:42.131935 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:42.131637 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:17:42.151274 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:42.151218 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" podStartSLOduration=2.395419436 podStartE2EDuration="5.151203823s" podCreationTimestamp="2026-04-16 13:17:37 +0000 UTC" firstStartedPulling="2026-04-16 13:17:38.318392719 +0000 UTC m=+361.008324512" lastFinishedPulling="2026-04-16 13:17:41.0741771 +0000 UTC m=+363.764108899" observedRunningTime="2026-04-16 13:17:42.149205943 +0000 UTC m=+364.839137759" watchObservedRunningTime="2026-04-16 13:17:42.151203823 +0000 UTC 
m=+364.841135637" Apr 16 13:17:45.109666 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.109635 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw"] Apr 16 13:17:45.110152 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.110111 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerName="util" Apr 16 13:17:45.110152 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.110130 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerName="util" Apr 16 13:17:45.110632 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.110157 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerName="pull" Apr 16 13:17:45.110632 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.110167 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerName="pull" Apr 16 13:17:45.110632 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.110176 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerName="extract" Apr 16 13:17:45.110632 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.110184 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerName="extract" Apr 16 13:17:45.110632 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.110273 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d7161e6e-3334-4b3c-9335-5a6137cef774" containerName="extract" Apr 16 13:17:45.113807 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.113786 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.116774 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.116747 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-682jg\"" Apr 16 13:17:45.116879 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.116775 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 13:17:45.118156 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.118139 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 13:17:45.124425 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.124404 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw"] Apr 16 13:17:45.298883 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.298846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.299037 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.298897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.299037 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.298999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xmbm\" (UniqueName: \"kubernetes.io/projected/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-kube-api-access-6xmbm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.399896 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.399798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.399896 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.399860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.400148 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.399919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xmbm\" (UniqueName: \"kubernetes.io/projected/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-kube-api-access-6xmbm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.400210 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:17:45.400178 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.400291 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.400270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.409941 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.409912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xmbm\" (UniqueName: \"kubernetes.io/projected/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-kube-api-access-6xmbm\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.423773 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.423745 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:45.562487 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:45.562443 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw"] Apr 16 13:17:45.563210 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:17:45.563181 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5f4ea48_a64b_4942_8969_00bb9fb4d5be.slice/crio-c2bc506e9a041c1081439ae67239aa9db44a8471cf5b4eace97a1a83790ed008 WatchSource:0}: Error finding container c2bc506e9a041c1081439ae67239aa9db44a8471cf5b4eace97a1a83790ed008: Status 404 returned error can't find the container with id c2bc506e9a041c1081439ae67239aa9db44a8471cf5b4eace97a1a83790ed008 Apr 16 13:17:46.145821 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.145783 2574 generic.go:358] "Generic (PLEG): container finished" podID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerID="4b6747f99b26de14438f35a2eca3822433817020adb3e9b95336b997f0cb6ea1" exitCode=0 Apr 16 13:17:46.146306 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.145876 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" event={"ID":"f5f4ea48-a64b-4942-8969-00bb9fb4d5be","Type":"ContainerDied","Data":"4b6747f99b26de14438f35a2eca3822433817020adb3e9b95336b997f0cb6ea1"} Apr 16 13:17:46.146306 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.145911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" event={"ID":"f5f4ea48-a64b-4942-8969-00bb9fb4d5be","Type":"ContainerStarted","Data":"c2bc506e9a041c1081439ae67239aa9db44a8471cf5b4eace97a1a83790ed008"} Apr 16 13:17:46.927303 ip-10-0-137-242 kubenswrapper[2574]: 
I0416 13:17:46.927268 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm"] Apr 16 13:17:46.930788 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.930766 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:46.934352 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.934329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 13:17:46.934506 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.934437 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 13:17:46.934588 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.934573 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 13:17:46.934673 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.934654 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 13:17:46.934732 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.934710 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5gs69\"" Apr 16 13:17:46.943911 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:46.943884 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm"] Apr 16 13:17:47.114814 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.114720 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgrt2\" (UniqueName: 
\"kubernetes.io/projected/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-kube-api-access-wgrt2\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.114814 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.114792 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.114814 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.114818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.151028 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.150989 2574 generic.go:358] "Generic (PLEG): container finished" podID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerID="bd601e1e0899e1d88106d4173b0f6eaf6f0d3594794018f3008fcbf166a0854d" exitCode=0 Apr 16 13:17:47.151445 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.151070 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" event={"ID":"f5f4ea48-a64b-4942-8969-00bb9fb4d5be","Type":"ContainerDied","Data":"bd601e1e0899e1d88106d4173b0f6eaf6f0d3594794018f3008fcbf166a0854d"} Apr 16 13:17:47.215674 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.215640 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.215822 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.215686 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.215822 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.215763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgrt2\" (UniqueName: \"kubernetes.io/projected/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-kube-api-access-wgrt2\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.218047 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.218020 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.218196 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.218098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.224796 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.224768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgrt2\" (UniqueName: \"kubernetes.io/projected/1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2-kube-api-access-wgrt2\") pod \"opendatahub-operator-controller-manager-5889847794-k6jmm\" (UID: \"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.241361 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.241330 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:47.378164 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:47.378134 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm"] Apr 16 13:17:47.380755 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:17:47.380732 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c3f9ca4_b8e1_4737_a5e5_e3b1115f17f2.slice/crio-75d749ad0459eedd0603c9a4f684870f98b51a3e790ae4b9c7b850047ba45f03 WatchSource:0}: Error finding container 75d749ad0459eedd0603c9a4f684870f98b51a3e790ae4b9c7b850047ba45f03: Status 404 returned error can't find the container with id 75d749ad0459eedd0603c9a4f684870f98b51a3e790ae4b9c7b850047ba45f03 Apr 16 13:17:48.156300 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:48.156264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" 
event={"ID":"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2","Type":"ContainerStarted","Data":"75d749ad0459eedd0603c9a4f684870f98b51a3e790ae4b9c7b850047ba45f03"} Apr 16 13:17:48.158557 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:48.158517 2574 generic.go:358] "Generic (PLEG): container finished" podID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerID="f613670982f99b3a2253b40e22bd3fabbff6d263f12e5d32f9bf3bd3c87a5de1" exitCode=0 Apr 16 13:17:48.158712 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:48.158575 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" event={"ID":"f5f4ea48-a64b-4942-8969-00bb9fb4d5be","Type":"ContainerDied","Data":"f613670982f99b3a2253b40e22bd3fabbff6d263f12e5d32f9bf3bd3c87a5de1"} Apr 16 13:17:49.402245 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.402223 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:49.434660 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.434623 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-bundle\") pod \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " Apr 16 13:17:49.434851 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.434699 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xmbm\" (UniqueName: \"kubernetes.io/projected/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-kube-api-access-6xmbm\") pod \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " Apr 16 13:17:49.434851 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.434772 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-util\") pod \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\" (UID: \"f5f4ea48-a64b-4942-8969-00bb9fb4d5be\") " Apr 16 13:17:49.435774 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.435742 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-bundle" (OuterVolumeSpecName: "bundle") pod "f5f4ea48-a64b-4942-8969-00bb9fb4d5be" (UID: "f5f4ea48-a64b-4942-8969-00bb9fb4d5be"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:17:49.437374 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.437349 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-kube-api-access-6xmbm" (OuterVolumeSpecName: "kube-api-access-6xmbm") pod "f5f4ea48-a64b-4942-8969-00bb9fb4d5be" (UID: "f5f4ea48-a64b-4942-8969-00bb9fb4d5be"). InnerVolumeSpecName "kube-api-access-6xmbm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:17:49.442955 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.442923 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-util" (OuterVolumeSpecName: "util") pod "f5f4ea48-a64b-4942-8969-00bb9fb4d5be" (UID: "f5f4ea48-a64b-4942-8969-00bb9fb4d5be"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:17:49.535531 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.535493 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:49.535531 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.535532 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6xmbm\" (UniqueName: \"kubernetes.io/projected/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-kube-api-access-6xmbm\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:49.535753 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:49.535549 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f5f4ea48-a64b-4942-8969-00bb9fb4d5be-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:17:50.167993 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:50.167956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" event={"ID":"1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2","Type":"ContainerStarted","Data":"5996259b87748d7829c1f148de75a7895169dd1dcc3b4bd4adc8850f7f617137"} Apr 16 13:17:50.168181 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:50.168046 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:17:50.169956 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:50.169928 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" event={"ID":"f5f4ea48-a64b-4942-8969-00bb9fb4d5be","Type":"ContainerDied","Data":"c2bc506e9a041c1081439ae67239aa9db44a8471cf5b4eace97a1a83790ed008"} Apr 16 13:17:50.169956 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:17:50.169957 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2bc506e9a041c1081439ae67239aa9db44a8471cf5b4eace97a1a83790ed008" Apr 16 13:17:50.170144 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:50.170025 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c94fxjw" Apr 16 13:17:50.189796 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:50.189747 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" podStartSLOduration=1.540906973 podStartE2EDuration="4.189734479s" podCreationTimestamp="2026-04-16 13:17:46 +0000 UTC" firstStartedPulling="2026-04-16 13:17:47.382525707 +0000 UTC m=+370.072457500" lastFinishedPulling="2026-04-16 13:17:50.031353209 +0000 UTC m=+372.721285006" observedRunningTime="2026-04-16 13:17:50.187521845 +0000 UTC m=+372.877453661" watchObservedRunningTime="2026-04-16 13:17:50.189734479 +0000 UTC m=+372.879666293" Apr 16 13:17:53.137773 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:17:53.137737 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-6vs9l" Apr 16 13:18:01.176798 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:01.176769 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-k6jmm" Apr 16 13:18:13.783689 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.783655 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm"] Apr 16 13:18:13.784189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.784167 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerName="extract" Apr 16 13:18:13.784189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.784185 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerName="extract" Apr 16 13:18:13.784302 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.784204 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerName="util" Apr 16 13:18:13.784302 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.784212 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerName="util" Apr 16 13:18:13.784302 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.784242 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerName="pull" Apr 16 13:18:13.784302 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.784251 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerName="pull" Apr 16 13:18:13.784528 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.784337 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f5f4ea48-a64b-4942-8969-00bb9fb4d5be" containerName="extract" Apr 16 13:18:13.789193 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.789171 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.792267 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.792230 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 13:18:13.792446 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.792424 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 13:18:13.793888 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.793853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-682jg\"" Apr 16 13:18:13.796393 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.796367 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm"] Apr 16 13:18:13.843340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.843302 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.843570 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.843383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.843570 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.843413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldcd\" (UniqueName: \"kubernetes.io/projected/e901fad7-313d-4445-8a83-b655020868b7-kube-api-access-qldcd\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.944799 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.944754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.944989 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.944828 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.944989 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.944857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qldcd\" (UniqueName: \"kubernetes.io/projected/e901fad7-313d-4445-8a83-b655020868b7-kube-api-access-qldcd\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.945152 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:18:13.945133 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.945239 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.945219 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:13.954151 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:13.954129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldcd\" (UniqueName: \"kubernetes.io/projected/e901fad7-313d-4445-8a83-b655020868b7-kube-api-access-qldcd\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:14.099642 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.099555 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" Apr 16 13:18:14.223144 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.223113 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm"] Apr 16 13:18:14.225249 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:18:14.225222 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode901fad7_313d_4445_8a83_b655020868b7.slice/crio-14d281d5342a07ac246c7c19c731d38ffb04d5bffa9ae012c1c68eee9d6867b0 WatchSource:0}: Error finding container 14d281d5342a07ac246c7c19c731d38ffb04d5bffa9ae012c1c68eee9d6867b0: Status 404 returned error can't find the container with id 14d281d5342a07ac246c7c19c731d38ffb04d5bffa9ae012c1c68eee9d6867b0 Apr 16 13:18:14.260268 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.260238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" event={"ID":"e901fad7-313d-4445-8a83-b655020868b7","Type":"ContainerStarted","Data":"14d281d5342a07ac246c7c19c731d38ffb04d5bffa9ae012c1c68eee9d6867b0"} Apr 16 13:18:14.331118 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.331082 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7"] Apr 16 13:18:14.334330 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.334307 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" Apr 16 13:18:14.337083 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.337058 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 13:18:14.337420 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.337378 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-4bnc5\"" Apr 16 13:18:14.337559 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.337433 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 16 13:18:14.337559 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.337437 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 16 13:18:14.337675 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.337635 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 13:18:14.341075 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.341055 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7"] Apr 16 13:18:14.448953 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.448922 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-tmp\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" Apr 16 13:18:14.449108 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.448967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lklm\" (UniqueName: 
\"kubernetes.io/projected/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-kube-api-access-4lklm\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" Apr 16 13:18:14.449108 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.449030 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-tls-certs\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" Apr 16 13:18:14.549792 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.549752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-tmp\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" Apr 16 13:18:14.549792 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.549796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lklm\" (UniqueName: \"kubernetes.io/projected/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-kube-api-access-4lklm\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" Apr 16 13:18:14.550032 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.549814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-tls-certs\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" Apr 16 13:18:14.552004 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.551981 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-tmp\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7"
Apr 16 13:18:14.552120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.552105 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-tls-certs\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7"
Apr 16 13:18:14.558699 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.558676 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lklm\" (UniqueName: \"kubernetes.io/projected/b81c2ff0-12e1-4362-88f7-83c3c2bd95d7-kube-api-access-4lklm\") pod \"kube-auth-proxy-7485ccd7bf-rwxp7\" (UID: \"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7\") " pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7"
Apr 16 13:18:14.677188 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.677145 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7"
Apr 16 13:18:14.801940 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:14.801908 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7"]
Apr 16 13:18:14.803791 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:18:14.803762 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb81c2ff0_12e1_4362_88f7_83c3c2bd95d7.slice/crio-715a9823bd9a77a763d821d6ad13396c485fe10cfa6199945e854998a39e35c8 WatchSource:0}: Error finding container 715a9823bd9a77a763d821d6ad13396c485fe10cfa6199945e854998a39e35c8: Status 404 returned error can't find the container with id 715a9823bd9a77a763d821d6ad13396c485fe10cfa6199945e854998a39e35c8
Apr 16 13:18:15.264793 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:15.264705 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" event={"ID":"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7","Type":"ContainerStarted","Data":"715a9823bd9a77a763d821d6ad13396c485fe10cfa6199945e854998a39e35c8"}
Apr 16 13:18:15.265936 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:15.265914 2574 generic.go:358] "Generic (PLEG): container finished" podID="e901fad7-313d-4445-8a83-b655020868b7" containerID="b90a4ad46be711ba282d06c31a6befdd88a3803f3bf71fa5dabdf3184aa6288d" exitCode=0
Apr 16 13:18:15.266073 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:15.266008 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" event={"ID":"e901fad7-313d-4445-8a83-b655020868b7","Type":"ContainerDied","Data":"b90a4ad46be711ba282d06c31a6befdd88a3803f3bf71fa5dabdf3184aa6288d"}
Apr 16 13:18:17.276978 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:17.276941 2574 generic.go:358] "Generic (PLEG): container finished" podID="e901fad7-313d-4445-8a83-b655020868b7" containerID="3010d2b7a911b6b297e5b957e356275f0afb461abe0d0f04c038d4b1d06b48e8" exitCode=0
Apr 16 13:18:17.277380 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:17.277011 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" event={"ID":"e901fad7-313d-4445-8a83-b655020868b7","Type":"ContainerDied","Data":"3010d2b7a911b6b297e5b957e356275f0afb461abe0d0f04c038d4b1d06b48e8"}
Apr 16 13:18:18.282887 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:18.282844 2574 generic.go:358] "Generic (PLEG): container finished" podID="e901fad7-313d-4445-8a83-b655020868b7" containerID="e265d617e0ba02ef52a632300e32ada2962f22ae4c6aa5045b82c777d583857f" exitCode=0
Apr 16 13:18:18.283333 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:18.282921 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" event={"ID":"e901fad7-313d-4445-8a83-b655020868b7","Type":"ContainerDied","Data":"e265d617e0ba02ef52a632300e32ada2962f22ae4c6aa5045b82c777d583857f"}
Apr 16 13:18:18.284398 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:18.284371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" event={"ID":"b81c2ff0-12e1-4362-88f7-83c3c2bd95d7","Type":"ContainerStarted","Data":"38add01bc52d2e14594759cb8e3e41c42bcc8f3bf6a2e0258221cb32ab41f7ff"}
Apr 16 13:18:18.319291 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:18.319232 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7485ccd7bf-rwxp7" podStartSLOduration=1.1786047 podStartE2EDuration="4.319214779s" podCreationTimestamp="2026-04-16 13:18:14 +0000 UTC" firstStartedPulling="2026-04-16 13:18:14.805512854 +0000 UTC m=+397.495444647" lastFinishedPulling="2026-04-16 13:18:17.946122933 +0000 UTC m=+400.636054726" observedRunningTime="2026-04-16 13:18:18.317954178 +0000 UTC m=+401.007885992" watchObservedRunningTime="2026-04-16 13:18:18.319214779 +0000 UTC m=+401.009146621"
Apr 16 13:18:19.414405 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.414379 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm"
Apr 16 13:18:19.502738 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.502702 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldcd\" (UniqueName: \"kubernetes.io/projected/e901fad7-313d-4445-8a83-b655020868b7-kube-api-access-qldcd\") pod \"e901fad7-313d-4445-8a83-b655020868b7\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") "
Apr 16 13:18:19.502952 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.502764 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-bundle\") pod \"e901fad7-313d-4445-8a83-b655020868b7\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") "
Apr 16 13:18:19.502952 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.502792 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-util\") pod \"e901fad7-313d-4445-8a83-b655020868b7\" (UID: \"e901fad7-313d-4445-8a83-b655020868b7\") "
Apr 16 13:18:19.503658 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.503631 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-bundle" (OuterVolumeSpecName: "bundle") pod "e901fad7-313d-4445-8a83-b655020868b7" (UID: "e901fad7-313d-4445-8a83-b655020868b7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:18:19.504930 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.504906 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e901fad7-313d-4445-8a83-b655020868b7-kube-api-access-qldcd" (OuterVolumeSpecName: "kube-api-access-qldcd") pod "e901fad7-313d-4445-8a83-b655020868b7" (UID: "e901fad7-313d-4445-8a83-b655020868b7"). InnerVolumeSpecName "kube-api-access-qldcd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:18:19.596316 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.596230 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-util" (OuterVolumeSpecName: "util") pod "e901fad7-313d-4445-8a83-b655020868b7" (UID: "e901fad7-313d-4445-8a83-b655020868b7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:18:19.604131 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.604105 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qldcd\" (UniqueName: \"kubernetes.io/projected/e901fad7-313d-4445-8a83-b655020868b7-kube-api-access-qldcd\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:18:19.604259 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.604134 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:18:19.604259 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:19.604150 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e901fad7-313d-4445-8a83-b655020868b7-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:18:20.296230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:20.296193 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm" event={"ID":"e901fad7-313d-4445-8a83-b655020868b7","Type":"ContainerDied","Data":"14d281d5342a07ac246c7c19c731d38ffb04d5bffa9ae012c1c68eee9d6867b0"}
Apr 16 13:18:20.296230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:20.296229 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14d281d5342a07ac246c7c19c731d38ffb04d5bffa9ae012c1c68eee9d6867b0"
Apr 16 13:18:20.296230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:20.296210 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835x7fcm"
Apr 16 13:18:27.659435 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.659404 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"]
Apr 16 13:18:27.659927 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.659791 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e901fad7-313d-4445-8a83-b655020868b7" containerName="pull"
Apr 16 13:18:27.659927 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.659803 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e901fad7-313d-4445-8a83-b655020868b7" containerName="pull"
Apr 16 13:18:27.659927 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.659813 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e901fad7-313d-4445-8a83-b655020868b7" containerName="extract"
Apr 16 13:18:27.659927 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.659818 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e901fad7-313d-4445-8a83-b655020868b7" containerName="extract"
Apr 16 13:18:27.659927 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.659840 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e901fad7-313d-4445-8a83-b655020868b7" containerName="util"
Apr 16 13:18:27.659927 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.659845 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e901fad7-313d-4445-8a83-b655020868b7" containerName="util"
Apr 16 13:18:27.659927 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.659913 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e901fad7-313d-4445-8a83-b655020868b7" containerName="extract"
Apr 16 13:18:27.669174 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.669150 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.673640 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.673603 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 13:18:27.673936 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.673918 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 13:18:27.675248 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.675223 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-682jg\""
Apr 16 13:18:27.682208 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.682183 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"]
Apr 16 13:18:27.779086 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.779052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbdwf\" (UniqueName: \"kubernetes.io/projected/d12e5421-1e0b-4092-b85d-e6eeef3620d7-kube-api-access-bbdwf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.779261 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.779103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.779261 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.779178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.880210 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.880173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbdwf\" (UniqueName: \"kubernetes.io/projected/d12e5421-1e0b-4092-b85d-e6eeef3620d7-kube-api-access-bbdwf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.880387 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.880220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.880387 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.880269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.880747 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.880725 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.880747 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.880739 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.889276 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.889249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbdwf\" (UniqueName: \"kubernetes.io/projected/d12e5421-1e0b-4092-b85d-e6eeef3620d7-kube-api-access-bbdwf\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:27.981567 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:27.981524 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:28.116145 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:28.116119 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"]
Apr 16 13:18:28.117977 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:18:28.117953 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd12e5421_1e0b_4092_b85d_e6eeef3620d7.slice/crio-3646eb26b1383995b59017863c3440abfd81d70936179dd22e893242f97001ca WatchSource:0}: Error finding container 3646eb26b1383995b59017863c3440abfd81d70936179dd22e893242f97001ca: Status 404 returned error can't find the container with id 3646eb26b1383995b59017863c3440abfd81d70936179dd22e893242f97001ca
Apr 16 13:18:28.327359 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:28.327331 2574 generic.go:358] "Generic (PLEG): container finished" podID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerID="d50e0da602a369ecf6718e5830fc52328d23ae3fecfdf1c0f6bdbc1bf0e104ee" exitCode=0
Apr 16 13:18:28.327532 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:28.327367 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94" event={"ID":"d12e5421-1e0b-4092-b85d-e6eeef3620d7","Type":"ContainerDied","Data":"d50e0da602a369ecf6718e5830fc52328d23ae3fecfdf1c0f6bdbc1bf0e104ee"}
Apr 16 13:18:28.327532 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:28.327389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94" event={"ID":"d12e5421-1e0b-4092-b85d-e6eeef3620d7","Type":"ContainerStarted","Data":"3646eb26b1383995b59017863c3440abfd81d70936179dd22e893242f97001ca"}
Apr 16 13:18:29.332613 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:29.332576 2574 generic.go:358] "Generic (PLEG): container finished" podID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerID="2022ff2c36620df9fd23aaa3a4731761fd8027c0496e62d73b0da9f38df8da3a" exitCode=0
Apr 16 13:18:29.333012 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:29.332684 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94" event={"ID":"d12e5421-1e0b-4092-b85d-e6eeef3620d7","Type":"ContainerDied","Data":"2022ff2c36620df9fd23aaa3a4731761fd8027c0496e62d73b0da9f38df8da3a"}
Apr 16 13:18:30.338521 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:30.338488 2574 generic.go:358] "Generic (PLEG): container finished" podID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerID="a8fb222fcfcd12b7b670de7c966092b85c3e9b0eacbffcf6b5b90039b4ec7f09" exitCode=0
Apr 16 13:18:30.338917 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:30.338617 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94" event={"ID":"d12e5421-1e0b-4092-b85d-e6eeef3620d7","Type":"ContainerDied","Data":"a8fb222fcfcd12b7b670de7c966092b85c3e9b0eacbffcf6b5b90039b4ec7f09"}
Apr 16 13:18:31.471284 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.471260 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:31.613517 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.613404 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbdwf\" (UniqueName: \"kubernetes.io/projected/d12e5421-1e0b-4092-b85d-e6eeef3620d7-kube-api-access-bbdwf\") pod \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") "
Apr 16 13:18:31.613517 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.613446 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-bundle\") pod \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") "
Apr 16 13:18:31.613517 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.613494 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-util\") pod \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\" (UID: \"d12e5421-1e0b-4092-b85d-e6eeef3620d7\") "
Apr 16 13:18:31.614962 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.614925 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-bundle" (OuterVolumeSpecName: "bundle") pod "d12e5421-1e0b-4092-b85d-e6eeef3620d7" (UID: "d12e5421-1e0b-4092-b85d-e6eeef3620d7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:18:31.622075 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.622045 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12e5421-1e0b-4092-b85d-e6eeef3620d7-kube-api-access-bbdwf" (OuterVolumeSpecName: "kube-api-access-bbdwf") pod "d12e5421-1e0b-4092-b85d-e6eeef3620d7" (UID: "d12e5421-1e0b-4092-b85d-e6eeef3620d7"). InnerVolumeSpecName "kube-api-access-bbdwf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:18:31.622236 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.622211 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-util" (OuterVolumeSpecName: "util") pod "d12e5421-1e0b-4092-b85d-e6eeef3620d7" (UID: "d12e5421-1e0b-4092-b85d-e6eeef3620d7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:18:31.715028 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.714982 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bbdwf\" (UniqueName: \"kubernetes.io/projected/d12e5421-1e0b-4092-b85d-e6eeef3620d7-kube-api-access-bbdwf\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:18:31.715028 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.715019 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:18:31.715028 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:31.715030 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d12e5421-1e0b-4092-b85d-e6eeef3620d7-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:18:32.348112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:32.348071 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94" event={"ID":"d12e5421-1e0b-4092-b85d-e6eeef3620d7","Type":"ContainerDied","Data":"3646eb26b1383995b59017863c3440abfd81d70936179dd22e893242f97001ca"}
Apr 16 13:18:32.348112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:32.348101 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2bzr94"
Apr 16 13:18:32.348112 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:18:32.348110 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3646eb26b1383995b59017863c3440abfd81d70936179dd22e893242f97001ca"
Apr 16 13:19:30.465333 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.465297 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"]
Apr 16 13:19:30.465803 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.465669 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerName="pull"
Apr 16 13:19:30.465803 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.465681 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerName="pull"
Apr 16 13:19:30.465803 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.465691 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerName="util"
Apr 16 13:19:30.465803 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.465696 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerName="util"
Apr 16 13:19:30.465803 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.465706 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerName="extract"
Apr 16 13:19:30.465803 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.465711 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerName="extract"
Apr 16 13:19:30.465803 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.465784 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d12e5421-1e0b-4092-b85d-e6eeef3620d7" containerName="extract"
Apr 16 13:19:30.468981 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.468963 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.471919 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.471890 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 13:19:30.472074 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.471984 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 13:19:30.472074 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.472055 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-b5hqc\""
Apr 16 13:19:30.477382 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.477357 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"]
Apr 16 13:19:30.621822 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.621782 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.622007 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.621848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.622007 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.621888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfghc\" (UniqueName: \"kubernetes.io/projected/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-kube-api-access-mfghc\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.722884 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.722794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.722884 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.722856 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfghc\" (UniqueName: \"kubernetes.io/projected/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-kube-api-access-mfghc\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.723118 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.722889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.723176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.723136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.723226 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.723187 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.732207 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.732179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfghc\" (UniqueName: \"kubernetes.io/projected/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-kube-api-access-mfghc\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.779745 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.779709 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"
Apr 16 13:19:30.905215 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:30.905189 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs"]
Apr 16 13:19:30.907108 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:19:30.907083 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dc9b953_175a_4ee0_b859_e7b0e9d03c19.slice/crio-bd9df9b782173910df37ac20733aea889f19ce0bd947287219bebcb1ace7044a WatchSource:0}: Error finding container bd9df9b782173910df37ac20733aea889f19ce0bd947287219bebcb1ace7044a: Status 404 returned error can't find the container with id bd9df9b782173910df37ac20733aea889f19ce0bd947287219bebcb1ace7044a
Apr 16 13:19:31.062198 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.062158 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"]
Apr 16 13:19:31.065586 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.065564 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.073059 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.073035 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"]
Apr 16 13:19:31.226655 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.226610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.226838 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.226741 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.226838 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.226795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mqvz\" (UniqueName: \"kubernetes.io/projected/29e9d28a-1ec6-47ae-ad10-b068e096acd7-kube-api-access-9mqvz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.328136 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.328046 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mqvz\" (UniqueName: \"kubernetes.io/projected/29e9d28a-1ec6-47ae-ad10-b068e096acd7-kube-api-access-9mqvz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.328136 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.328099 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.328324 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.328153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.328556 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.328540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.328592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.328570 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.336727 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.336699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mqvz\" (UniqueName: \"kubernetes.io/projected/29e9d28a-1ec6-47ae-ad10-b068e096acd7-kube-api-access-9mqvz\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.390573 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.390542 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:31.513090 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.513062 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"]
Apr 16 13:19:31.514058 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:19:31.514027 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29e9d28a_1ec6_47ae_ad10_b068e096acd7.slice/crio-9bbb7c5461279427062771b176fcb806dcf39ff8223b98df750d6d174e0b7aaa WatchSource:0}: Error finding container 9bbb7c5461279427062771b176fcb806dcf39ff8223b98df750d6d174e0b7aaa: Status 404 returned error can't find the container with id 9bbb7c5461279427062771b176fcb806dcf39ff8223b98df750d6d174e0b7aaa
Apr 16 13:19:31.568168 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.568134 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
event={"ID":"29e9d28a-1ec6-47ae-ad10-b068e096acd7","Type":"ContainerStarted","Data":"9bbb7c5461279427062771b176fcb806dcf39ff8223b98df750d6d174e0b7aaa"} Apr 16 13:19:31.569631 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.569607 2574 generic.go:358] "Generic (PLEG): container finished" podID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerID="719b7df6200a8ed72218f5e9578551bb5577261111c12d3709b08fecc2cbd22f" exitCode=0 Apr 16 13:19:31.569743 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.569680 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs" event={"ID":"2dc9b953-175a-4ee0-b859-e7b0e9d03c19","Type":"ContainerDied","Data":"719b7df6200a8ed72218f5e9578551bb5577261111c12d3709b08fecc2cbd22f"} Apr 16 13:19:31.569743 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.569710 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs" event={"ID":"2dc9b953-175a-4ee0-b859-e7b0e9d03c19","Type":"ContainerStarted","Data":"bd9df9b782173910df37ac20733aea889f19ce0bd947287219bebcb1ace7044a"} Apr 16 13:19:31.657830 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.657799 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4"] Apr 16 13:19:31.661509 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.661494 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.668184 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.668160 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4"] Apr 16 13:19:31.832776 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.832733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqplf\" (UniqueName: \"kubernetes.io/projected/aaca4958-bbc5-4859-8e1a-aff64b58930f-kube-api-access-mqplf\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.832971 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.832834 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.832971 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.832890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.933792 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.933757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mqplf\" (UniqueName: \"kubernetes.io/projected/aaca4958-bbc5-4859-8e1a-aff64b58930f-kube-api-access-mqplf\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.933967 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.933827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.933967 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.933861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.934216 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.934197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.934253 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.934238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-bundle\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.942870 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.942844 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqplf\" (UniqueName: \"kubernetes.io/projected/aaca4958-bbc5-4859-8e1a-aff64b58930f-kube-api-access-mqplf\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:31.994361 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:31.994329 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" Apr 16 13:19:32.063316 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.063286 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj"] Apr 16 13:19:32.068171 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.068150 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.076363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.076337 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj"] Apr 16 13:19:32.119851 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.119825 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4"] Apr 16 13:19:32.120761 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:19:32.120728 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaca4958_bbc5_4859_8e1a_aff64b58930f.slice/crio-402ea724b39949f9254ab92d146c52c5a564bfdc396de40e79f27ce7da362cf7 WatchSource:0}: Error finding container 402ea724b39949f9254ab92d146c52c5a564bfdc396de40e79f27ce7da362cf7: Status 404 returned error can't find the container with id 402ea724b39949f9254ab92d146c52c5a564bfdc396de40e79f27ce7da362cf7 Apr 16 13:19:32.236122 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.236097 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.236253 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.236131 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx42d\" (UniqueName: \"kubernetes.io/projected/e95696cd-e809-4a69-9157-868b6467640a-kube-api-access-qx42d\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: 
\"e95696cd-e809-4a69-9157-868b6467640a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.236294 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.236264 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.337175 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.337137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.337347 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.337191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.337347 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.337222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx42d\" (UniqueName: \"kubernetes.io/projected/e95696cd-e809-4a69-9157-868b6467640a-kube-api-access-qx42d\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.337641 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.337619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.337694 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.337625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.346665 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.346632 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx42d\" (UniqueName: \"kubernetes.io/projected/e95696cd-e809-4a69-9157-868b6467640a-kube-api-access-qx42d\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.383104 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.383073 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" Apr 16 13:19:32.543642 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.543619 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj"] Apr 16 13:19:32.545756 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:19:32.545725 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode95696cd_e809_4a69_9157_868b6467640a.slice/crio-fcf391913888ce46539ed51cd2d6810124f3a9077e4a944ee6c4ba9768fa0d93 WatchSource:0}: Error finding container fcf391913888ce46539ed51cd2d6810124f3a9077e4a944ee6c4ba9768fa0d93: Status 404 returned error can't find the container with id fcf391913888ce46539ed51cd2d6810124f3a9077e4a944ee6c4ba9768fa0d93 Apr 16 13:19:32.574782 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.574753 2574 generic.go:358] "Generic (PLEG): container finished" podID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerID="f5e786ae461bc13f32be8388962260a5308b385db33bb8414c6e77e11a3ff9a2" exitCode=0 Apr 16 13:19:32.574901 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.574829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" event={"ID":"aaca4958-bbc5-4859-8e1a-aff64b58930f","Type":"ContainerDied","Data":"f5e786ae461bc13f32be8388962260a5308b385db33bb8414c6e77e11a3ff9a2"} Apr 16 13:19:32.574901 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.574870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" event={"ID":"aaca4958-bbc5-4859-8e1a-aff64b58930f","Type":"ContainerStarted","Data":"402ea724b39949f9254ab92d146c52c5a564bfdc396de40e79f27ce7da362cf7"} Apr 16 13:19:32.576135 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.576114 
2574 generic.go:358] "Generic (PLEG): container finished" podID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerID="b80b6c1fa29f1121c593d3f240a514f1b3e802a65a00c69e26ec4e44e90eddce" exitCode=0 Apr 16 13:19:32.576232 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.576197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9" event={"ID":"29e9d28a-1ec6-47ae-ad10-b068e096acd7","Type":"ContainerDied","Data":"b80b6c1fa29f1121c593d3f240a514f1b3e802a65a00c69e26ec4e44e90eddce"} Apr 16 13:19:32.578049 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.578025 2574 generic.go:358] "Generic (PLEG): container finished" podID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerID="5b79cf40cb2a0676d807d5c05650f86f93e89008dba2459ae0d42bb48b981ddf" exitCode=0 Apr 16 13:19:32.578151 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.578099 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs" event={"ID":"2dc9b953-175a-4ee0-b859-e7b0e9d03c19","Type":"ContainerDied","Data":"5b79cf40cb2a0676d807d5c05650f86f93e89008dba2459ae0d42bb48b981ddf"} Apr 16 13:19:32.579381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:32.579357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" event={"ID":"e95696cd-e809-4a69-9157-868b6467640a","Type":"ContainerStarted","Data":"fcf391913888ce46539ed51cd2d6810124f3a9077e4a944ee6c4ba9768fa0d93"} Apr 16 13:19:33.590869 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:33.590780 2574 generic.go:358] "Generic (PLEG): container finished" podID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerID="9a3c8fcb923ed06f3cf2d1608dd66b431cb8e44259eeea43c1232de99ff2a04a" exitCode=0 Apr 16 13:19:33.591249 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:33.590865 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" event={"ID":"aaca4958-bbc5-4859-8e1a-aff64b58930f","Type":"ContainerDied","Data":"9a3c8fcb923ed06f3cf2d1608dd66b431cb8e44259eeea43c1232de99ff2a04a"} Apr 16 13:19:33.592574 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:33.592505 2574 generic.go:358] "Generic (PLEG): container finished" podID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerID="73e9db23facc87f85946e48773d0ce2fe85d6cf0ca80ff43fe8d9e249561bfe7" exitCode=0 Apr 16 13:19:33.592652 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:33.592593 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9" event={"ID":"29e9d28a-1ec6-47ae-ad10-b068e096acd7","Type":"ContainerDied","Data":"73e9db23facc87f85946e48773d0ce2fe85d6cf0ca80ff43fe8d9e249561bfe7"} Apr 16 13:19:33.594522 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:33.594503 2574 generic.go:358] "Generic (PLEG): container finished" podID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerID="ce2cf97370a06d5059cd6b0ef62a967f6ae16fd2d5dbb068368b4da28f3be205" exitCode=0 Apr 16 13:19:33.594612 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:33.594564 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs" event={"ID":"2dc9b953-175a-4ee0-b859-e7b0e9d03c19","Type":"ContainerDied","Data":"ce2cf97370a06d5059cd6b0ef62a967f6ae16fd2d5dbb068368b4da28f3be205"} Apr 16 13:19:33.595935 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:33.595891 2574 generic.go:358] "Generic (PLEG): container finished" podID="e95696cd-e809-4a69-9157-868b6467640a" containerID="7677e1bffb7aeeddd2900b7c16ca2d12275b334a731155f41e092a365204655c" exitCode=0 Apr 16 13:19:33.595935 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:33.595932 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" event={"ID":"e95696cd-e809-4a69-9157-868b6467640a","Type":"ContainerDied","Data":"7677e1bffb7aeeddd2900b7c16ca2d12275b334a731155f41e092a365204655c"} Apr 16 13:19:34.601873 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.601841 2574 generic.go:358] "Generic (PLEG): container finished" podID="e95696cd-e809-4a69-9157-868b6467640a" containerID="58a566ee2ed803d2c360a252fb279d545f92443f8b04d387ae4142e761f169a0" exitCode=0 Apr 16 13:19:34.602301 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.601925 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" event={"ID":"e95696cd-e809-4a69-9157-868b6467640a","Type":"ContainerDied","Data":"58a566ee2ed803d2c360a252fb279d545f92443f8b04d387ae4142e761f169a0"} Apr 16 13:19:34.604153 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.604129 2574 generic.go:358] "Generic (PLEG): container finished" podID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerID="8de26fb18454841774ee27c107696dd04d5bd1efb2c4e772c1ff3a7729037f23" exitCode=0 Apr 16 13:19:34.604257 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.604212 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" event={"ID":"aaca4958-bbc5-4859-8e1a-aff64b58930f","Type":"ContainerDied","Data":"8de26fb18454841774ee27c107696dd04d5bd1efb2c4e772c1ff3a7729037f23"} Apr 16 13:19:34.606079 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.606054 2574 generic.go:358] "Generic (PLEG): container finished" podID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerID="812678f28888e108d0dbbb1bdd7d65702c5aad2d110a746fe950faac8e07b550" exitCode=0 Apr 16 13:19:34.606170 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.606145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9" event={"ID":"29e9d28a-1ec6-47ae-ad10-b068e096acd7","Type":"ContainerDied","Data":"812678f28888e108d0dbbb1bdd7d65702c5aad2d110a746fe950faac8e07b550"} Apr 16 13:19:34.735790 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.735767 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs" Apr 16 13:19:34.756961 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.756929 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfghc\" (UniqueName: \"kubernetes.io/projected/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-kube-api-access-mfghc\") pod \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " Apr 16 13:19:34.757133 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.756978 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-util\") pod \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " Apr 16 13:19:34.757133 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.757006 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-bundle\") pod \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\" (UID: \"2dc9b953-175a-4ee0-b859-e7b0e9d03c19\") " Apr 16 13:19:34.757839 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.757808 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-bundle" (OuterVolumeSpecName: "bundle") pod "2dc9b953-175a-4ee0-b859-e7b0e9d03c19" (UID: "2dc9b953-175a-4ee0-b859-e7b0e9d03c19"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:19:34.759217 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.759190 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-kube-api-access-mfghc" (OuterVolumeSpecName: "kube-api-access-mfghc") pod "2dc9b953-175a-4ee0-b859-e7b0e9d03c19" (UID: "2dc9b953-175a-4ee0-b859-e7b0e9d03c19"). InnerVolumeSpecName "kube-api-access-mfghc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:19:34.763081 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.763061 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-util" (OuterVolumeSpecName: "util") pod "2dc9b953-175a-4ee0-b859-e7b0e9d03c19" (UID: "2dc9b953-175a-4ee0-b859-e7b0e9d03c19"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:19:34.857659 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.857619 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mfghc\" (UniqueName: \"kubernetes.io/projected/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-kube-api-access-mfghc\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:19:34.857659 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.857649 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:19:34.857659 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:34.857662 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2dc9b953-175a-4ee0-b859-e7b0e9d03c19-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:19:35.611371 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.611333 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs" event={"ID":"2dc9b953-175a-4ee0-b859-e7b0e9d03c19","Type":"ContainerDied","Data":"bd9df9b782173910df37ac20733aea889f19ce0bd947287219bebcb1ace7044a"} Apr 16 13:19:35.611371 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.611359 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs" Apr 16 13:19:35.611371 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.611368 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd9df9b782173910df37ac20733aea889f19ce0bd947287219bebcb1ace7044a" Apr 16 13:19:35.613158 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.613134 2574 generic.go:358] "Generic (PLEG): container finished" podID="e95696cd-e809-4a69-9157-868b6467640a" containerID="80c942e90784e5b3bfffc3fc900af0f7dddc9cdf397a7d3c15fb929ac8cfec66" exitCode=0 Apr 16 13:19:35.613270 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.613229 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" event={"ID":"e95696cd-e809-4a69-9157-868b6467640a","Type":"ContainerDied","Data":"80c942e90784e5b3bfffc3fc900af0f7dddc9cdf397a7d3c15fb929ac8cfec66"} Apr 16 13:19:35.747514 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.747491 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:35.764098 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.764074 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mqvz\" (UniqueName: \"kubernetes.io/projected/29e9d28a-1ec6-47ae-ad10-b068e096acd7-kube-api-access-9mqvz\") pod \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") "
Apr 16 13:19:35.764243 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.764132 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-util\") pod \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") "
Apr 16 13:19:35.764521 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.764427 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-bundle\") pod \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\" (UID: \"29e9d28a-1ec6-47ae-ad10-b068e096acd7\") "
Apr 16 13:19:35.765799 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.765754 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-bundle" (OuterVolumeSpecName: "bundle") pod "29e9d28a-1ec6-47ae-ad10-b068e096acd7" (UID: "29e9d28a-1ec6-47ae-ad10-b068e096acd7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:19:35.767524 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.767496 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e9d28a-1ec6-47ae-ad10-b068e096acd7-kube-api-access-9mqvz" (OuterVolumeSpecName: "kube-api-access-9mqvz") pod "29e9d28a-1ec6-47ae-ad10-b068e096acd7" (UID: "29e9d28a-1ec6-47ae-ad10-b068e096acd7"). InnerVolumeSpecName "kube-api-access-9mqvz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:19:35.774146 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.774116 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-util" (OuterVolumeSpecName: "util") pod "29e9d28a-1ec6-47ae-ad10-b068e096acd7" (UID: "29e9d28a-1ec6-47ae-ad10-b068e096acd7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:19:35.779583 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.779562 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4"
Apr 16 13:19:35.865661 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.865573 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-util\") pod \"aaca4958-bbc5-4859-8e1a-aff64b58930f\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") "
Apr 16 13:19:35.865661 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.865611 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-bundle\") pod \"aaca4958-bbc5-4859-8e1a-aff64b58930f\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") "
Apr 16 13:19:35.865661 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.865657 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqplf\" (UniqueName: \"kubernetes.io/projected/aaca4958-bbc5-4859-8e1a-aff64b58930f-kube-api-access-mqplf\") pod \"aaca4958-bbc5-4859-8e1a-aff64b58930f\" (UID: \"aaca4958-bbc5-4859-8e1a-aff64b58930f\") "
Apr 16 13:19:35.865910 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.865895 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:35.865965 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.865919 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9mqvz\" (UniqueName: \"kubernetes.io/projected/29e9d28a-1ec6-47ae-ad10-b068e096acd7-kube-api-access-9mqvz\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:35.865965 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.865933 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/29e9d28a-1ec6-47ae-ad10-b068e096acd7-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:35.866216 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.866183 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-bundle" (OuterVolumeSpecName: "bundle") pod "aaca4958-bbc5-4859-8e1a-aff64b58930f" (UID: "aaca4958-bbc5-4859-8e1a-aff64b58930f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:19:35.867898 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.867872 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaca4958-bbc5-4859-8e1a-aff64b58930f-kube-api-access-mqplf" (OuterVolumeSpecName: "kube-api-access-mqplf") pod "aaca4958-bbc5-4859-8e1a-aff64b58930f" (UID: "aaca4958-bbc5-4859-8e1a-aff64b58930f"). InnerVolumeSpecName "kube-api-access-mqplf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:19:35.871039 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.871014 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-util" (OuterVolumeSpecName: "util") pod "aaca4958-bbc5-4859-8e1a-aff64b58930f" (UID: "aaca4958-bbc5-4859-8e1a-aff64b58930f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:19:35.966602 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.966565 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:35.966602 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.966595 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/aaca4958-bbc5-4859-8e1a-aff64b58930f-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:35.966602 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:35.966604 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqplf\" (UniqueName: \"kubernetes.io/projected/aaca4958-bbc5-4859-8e1a-aff64b58930f-kube-api-access-mqplf\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:36.618692 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.618665 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4"
Apr 16 13:19:36.619131 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.618660 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4" event={"ID":"aaca4958-bbc5-4859-8e1a-aff64b58930f","Type":"ContainerDied","Data":"402ea724b39949f9254ab92d146c52c5a564bfdc396de40e79f27ce7da362cf7"}
Apr 16 13:19:36.619131 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.618784 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402ea724b39949f9254ab92d146c52c5a564bfdc396de40e79f27ce7da362cf7"
Apr 16 13:19:36.620487 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.620446 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9"
Apr 16 13:19:36.620649 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.620484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9" event={"ID":"29e9d28a-1ec6-47ae-ad10-b068e096acd7","Type":"ContainerDied","Data":"9bbb7c5461279427062771b176fcb806dcf39ff8223b98df750d6d174e0b7aaa"}
Apr 16 13:19:36.620649 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.620508 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bbb7c5461279427062771b176fcb806dcf39ff8223b98df750d6d174e0b7aaa"
Apr 16 13:19:36.749173 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.749150 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj"
Apr 16 13:19:36.775037 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.775007 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx42d\" (UniqueName: \"kubernetes.io/projected/e95696cd-e809-4a69-9157-868b6467640a-kube-api-access-qx42d\") pod \"e95696cd-e809-4a69-9157-868b6467640a\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") "
Apr 16 13:19:36.775191 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.775046 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-bundle\") pod \"e95696cd-e809-4a69-9157-868b6467640a\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") "
Apr 16 13:19:36.775191 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.775080 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-util\") pod \"e95696cd-e809-4a69-9157-868b6467640a\" (UID: \"e95696cd-e809-4a69-9157-868b6467640a\") "
Apr 16 13:19:36.775592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.775564 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-bundle" (OuterVolumeSpecName: "bundle") pod "e95696cd-e809-4a69-9157-868b6467640a" (UID: "e95696cd-e809-4a69-9157-868b6467640a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:19:36.777482 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.777443 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e95696cd-e809-4a69-9157-868b6467640a-kube-api-access-qx42d" (OuterVolumeSpecName: "kube-api-access-qx42d") pod "e95696cd-e809-4a69-9157-868b6467640a" (UID: "e95696cd-e809-4a69-9157-868b6467640a"). InnerVolumeSpecName "kube-api-access-qx42d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:19:36.783586 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.783484 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-util" (OuterVolumeSpecName: "util") pod "e95696cd-e809-4a69-9157-868b6467640a" (UID: "e95696cd-e809-4a69-9157-868b6467640a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 13:19:36.876408 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.876316 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qx42d\" (UniqueName: \"kubernetes.io/projected/e95696cd-e809-4a69-9157-868b6467640a-kube-api-access-qx42d\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:36.876408 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.876350 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:36.876408 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:36.876359 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e95696cd-e809-4a69-9157-868b6467640a-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:19:37.625815 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:37.625780 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj" event={"ID":"e95696cd-e809-4a69-9157-868b6467640a","Type":"ContainerDied","Data":"fcf391913888ce46539ed51cd2d6810124f3a9077e4a944ee6c4ba9768fa0d93"}
Apr 16 13:19:37.625815 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:37.625800 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj"
Apr 16 13:19:37.625815 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:37.625817 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcf391913888ce46539ed51cd2d6810124f3a9077e4a944ee6c4ba9768fa0d93"
Apr 16 13:19:46.823895 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.823857 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9c44f7d9c-2hwkw"]
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824199 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerName="util"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824209 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerName="util"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824221 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerName="util"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824226 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerName="util"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824235 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e95696cd-e809-4a69-9157-868b6467640a" containerName="util"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824240 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95696cd-e809-4a69-9157-868b6467640a" containerName="util"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824247 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerName="util"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824253 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerName="util"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824258 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerName="pull"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824262 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerName="pull"
Apr 16 13:19:46.824266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824268 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824273 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824282 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e95696cd-e809-4a69-9157-868b6467640a" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824287 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95696cd-e809-4a69-9157-868b6467640a" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824294 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e95696cd-e809-4a69-9157-868b6467640a" containerName="pull"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824300 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="e95696cd-e809-4a69-9157-868b6467640a" containerName="pull"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824315 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824320 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824326 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824331 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824338 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerName="pull"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824343 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerName="pull"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824349 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerName="pull"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824354 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerName="pull"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824404 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2dc9b953-175a-4ee0-b859-e7b0e9d03c19" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824411 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="29e9d28a-1ec6-47ae-ad10-b068e096acd7" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824419 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="e95696cd-e809-4a69-9157-868b6467640a" containerName="extract"
Apr 16 13:19:46.824621 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.824426 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaca4958-bbc5-4859-8e1a-aff64b58930f" containerName="extract"
Apr 16 13:19:46.828830 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.828808 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.840659 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.840624 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c44f7d9c-2hwkw"]
Apr 16 13:19:46.856123 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.856094 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-serving-cert\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.856123 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.856125 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-service-ca\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.856340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.856169 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tss7m\" (UniqueName: \"kubernetes.io/projected/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-kube-api-access-tss7m\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.856340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.856188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-config\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.856340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.856239 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-oauth-serving-cert\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.856340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.856274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-oauth-config\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.856340 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.856340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-trusted-ca-bundle\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.957616 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.957581 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-trusted-ca-bundle\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.957616 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.957627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-serving-cert\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.957905 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.957643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-service-ca\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.957905 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.957679 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tss7m\" (UniqueName: \"kubernetes.io/projected/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-kube-api-access-tss7m\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.957905 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.957696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-config\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.957905 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.957733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-oauth-serving-cert\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.957905 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.957751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-oauth-config\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.958515 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.958487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-service-ca\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.958651 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.958588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-config\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.958651 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.958625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-oauth-serving-cert\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.958756 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.958654 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-trusted-ca-bundle\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.960375 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.960354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-oauth-config\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.960479 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.960373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-console-serving-cert\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:46.965864 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:46.965843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tss7m\" (UniqueName: \"kubernetes.io/projected/6e1553e0-6e76-4f1d-bf40-5bdeb970c118-kube-api-access-tss7m\") pod \"console-9c44f7d9c-2hwkw\" (UID: \"6e1553e0-6e76-4f1d-bf40-5bdeb970c118\") " pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:47.140891 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:47.140799 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:47.266355 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:47.266330 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9c44f7d9c-2hwkw"]
Apr 16 13:19:47.267900 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:19:47.267870 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e1553e0_6e76_4f1d_bf40_5bdeb970c118.slice/crio-d4a5f92a44df868c185b6d96e105b4871c6d24d28a9a5c9c9d7cda17cbea35e8 WatchSource:0}: Error finding container d4a5f92a44df868c185b6d96e105b4871c6d24d28a9a5c9c9d7cda17cbea35e8: Status 404 returned error can't find the container with id d4a5f92a44df868c185b6d96e105b4871c6d24d28a9a5c9c9d7cda17cbea35e8
Apr 16 13:19:47.669532 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:47.669488 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c44f7d9c-2hwkw" event={"ID":"6e1553e0-6e76-4f1d-bf40-5bdeb970c118","Type":"ContainerStarted","Data":"a9b48e28b1e2c47823a71fcfc71bf222e0376e34002ef18ab188abefc30f53ac"}
Apr 16 13:19:47.669532 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:47.669536 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9c44f7d9c-2hwkw" event={"ID":"6e1553e0-6e76-4f1d-bf40-5bdeb970c118","Type":"ContainerStarted","Data":"d4a5f92a44df868c185b6d96e105b4871c6d24d28a9a5c9c9d7cda17cbea35e8"}
Apr 16 13:19:47.691120 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:47.691069 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9c44f7d9c-2hwkw" podStartSLOduration=1.691055111 podStartE2EDuration="1.691055111s" podCreationTimestamp="2026-04-16 13:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:19:47.689359433 +0000 UTC m=+490.379291280" watchObservedRunningTime="2026-04-16 13:19:47.691055111 +0000 UTC m=+490.380986926"
Apr 16 13:19:57.043053 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.043015 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"]
Apr 16 13:19:57.048150 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.048130 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.051116 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.051092 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 13:19:57.051266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.051097 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 13:19:57.051266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.051173 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 13:19:57.052398 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.052382 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 13:19:57.052499 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.052411 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-b5hqc\""
Apr 16 13:19:57.057022 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.057000 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"]
Apr 16 13:19:57.141289 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.141224 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:57.141445 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.141304 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:57.144045 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.144021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27f800cd-853c-4b2b-af56-1a8597700a17-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.144147 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.144060 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27f800cd-853c-4b2b-af56-1a8597700a17-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.144213 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.144192 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg8w8\" (UniqueName: \"kubernetes.io/projected/27f800cd-853c-4b2b-af56-1a8597700a17-kube-api-access-pg8w8\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.146253 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.146233 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:57.245230 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.245192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg8w8\" (UniqueName: \"kubernetes.io/projected/27f800cd-853c-4b2b-af56-1a8597700a17-kube-api-access-pg8w8\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.245407 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.245352 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27f800cd-853c-4b2b-af56-1a8597700a17-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.245501 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.245407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27f800cd-853c-4b2b-af56-1a8597700a17-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.246114 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.246092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/27f800cd-853c-4b2b-af56-1a8597700a17-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.247917 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.247890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/27f800cd-853c-4b2b-af56-1a8597700a17-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.253855 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.253831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg8w8\" (UniqueName: \"kubernetes.io/projected/27f800cd-853c-4b2b-af56-1a8597700a17-kube-api-access-pg8w8\") pod \"kuadrant-console-plugin-6cb54b5c86-988vx\" (UID: \"27f800cd-853c-4b2b-af56-1a8597700a17\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.359034 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.358939 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"
Apr 16 13:19:57.504959 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.504928 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx"]
Apr 16 13:19:57.506529 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:19:57.506503 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27f800cd_853c_4b2b_af56_1a8597700a17.slice/crio-8c3b74c1b11cfc505bc93432a59f951c2e4b7c0378314f42c0205b2c42e881ec WatchSource:0}: Error finding container 8c3b74c1b11cfc505bc93432a59f951c2e4b7c0378314f42c0205b2c42e881ec: Status 404 returned error can't find the container with id 8c3b74c1b11cfc505bc93432a59f951c2e4b7c0378314f42c0205b2c42e881ec
Apr 16 13:19:57.707218 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.707181 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx" event={"ID":"27f800cd-853c-4b2b-af56-1a8597700a17","Type":"ContainerStarted","Data":"8c3b74c1b11cfc505bc93432a59f951c2e4b7c0378314f42c0205b2c42e881ec"}
Apr 16 13:19:57.711241 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.711217 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9c44f7d9c-2hwkw"
Apr 16 13:19:57.769704 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:19:57.769674 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7db7bfdf6f-5p5zn"]
Apr 16 13:20:21.816373 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:21.816339 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx" event={"ID":"27f800cd-853c-4b2b-af56-1a8597700a17","Type":"ContainerStarted","Data":"16cee1db07c9a1e9e89f92ffec4e894a1f5ac9d8fb20cf3e2468cbc61c5451c6"}
Apr 16 13:20:21.831511 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:21.831439 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-988vx" podStartSLOduration=1.034059118 podStartE2EDuration="24.831419722s" podCreationTimestamp="2026-04-16 13:19:57 +0000 UTC" firstStartedPulling="2026-04-16 13:19:57.507806043 +0000 UTC m=+500.197737835" lastFinishedPulling="2026-04-16 13:20:21.305166631 +0000 UTC m=+523.995098439" observedRunningTime="2026-04-16 13:20:21.830804233 +0000 UTC m=+524.520736052" watchObservedRunningTime="2026-04-16 13:20:21.831419722 +0000 UTC m=+524.521351539"
Apr 16 13:20:22.789628 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:22.789569 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7db7bfdf6f-5p5zn" podUID="f22d56e1-1f03-4547-a594-9ae793ac47c4" containerName="console" containerID="cri-o://94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f" gracePeriod=15
Apr 16 13:20:23.086040 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.086013 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7db7bfdf6f-5p5zn_f22d56e1-1f03-4547-a594-9ae793ac47c4/console/0.log"
Apr 16 13:20:23.086355 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.086074 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:20:23.190770 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.190731 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-service-ca\") pod \"f22d56e1-1f03-4547-a594-9ae793ac47c4\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " Apr 16 13:20:23.190947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.190798 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-trusted-ca-bundle\") pod \"f22d56e1-1f03-4547-a594-9ae793ac47c4\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " Apr 16 13:20:23.190947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.190819 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp9xq\" (UniqueName: \"kubernetes.io/projected/f22d56e1-1f03-4547-a594-9ae793ac47c4-kube-api-access-jp9xq\") pod \"f22d56e1-1f03-4547-a594-9ae793ac47c4\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " Apr 16 13:20:23.190947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.190838 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-oauth-serving-cert\") pod \"f22d56e1-1f03-4547-a594-9ae793ac47c4\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " Apr 16 13:20:23.190947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.190874 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-serving-cert\") pod \"f22d56e1-1f03-4547-a594-9ae793ac47c4\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " Apr 16 13:20:23.190947 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.190902 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-config\") pod \"f22d56e1-1f03-4547-a594-9ae793ac47c4\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " Apr 16 13:20:23.190947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.190927 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-oauth-config\") pod \"f22d56e1-1f03-4547-a594-9ae793ac47c4\" (UID: \"f22d56e1-1f03-4547-a594-9ae793ac47c4\") " Apr 16 13:20:23.191197 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.191174 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "f22d56e1-1f03-4547-a594-9ae793ac47c4" (UID: "f22d56e1-1f03-4547-a594-9ae793ac47c4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:20:23.191620 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.191594 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f22d56e1-1f03-4547-a594-9ae793ac47c4" (UID: "f22d56e1-1f03-4547-a594-9ae793ac47c4"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:20:23.191718 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.191593 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-config" (OuterVolumeSpecName: "console-config") pod "f22d56e1-1f03-4547-a594-9ae793ac47c4" (UID: "f22d56e1-1f03-4547-a594-9ae793ac47c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:20:23.191718 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.191638 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f22d56e1-1f03-4547-a594-9ae793ac47c4" (UID: "f22d56e1-1f03-4547-a594-9ae793ac47c4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:20:23.193514 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.193446 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f22d56e1-1f03-4547-a594-9ae793ac47c4" (UID: "f22d56e1-1f03-4547-a594-9ae793ac47c4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:20:23.193624 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.193584 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f22d56e1-1f03-4547-a594-9ae793ac47c4-kube-api-access-jp9xq" (OuterVolumeSpecName: "kube-api-access-jp9xq") pod "f22d56e1-1f03-4547-a594-9ae793ac47c4" (UID: "f22d56e1-1f03-4547-a594-9ae793ac47c4"). InnerVolumeSpecName "kube-api-access-jp9xq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:20:23.193682 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.193635 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f22d56e1-1f03-4547-a594-9ae793ac47c4" (UID: "f22d56e1-1f03-4547-a594-9ae793ac47c4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:20:23.292476 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.292412 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-config\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:20:23.292476 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.292474 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-oauth-config\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:20:23.292476 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.292485 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-service-ca\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:20:23.292742 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.292495 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-trusted-ca-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:20:23.292742 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.292506 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jp9xq\" (UniqueName: 
\"kubernetes.io/projected/f22d56e1-1f03-4547-a594-9ae793ac47c4-kube-api-access-jp9xq\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:20:23.292742 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.292515 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f22d56e1-1f03-4547-a594-9ae793ac47c4-oauth-serving-cert\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:20:23.292742 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.292529 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f22d56e1-1f03-4547-a594-9ae793ac47c4-console-serving-cert\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:20:23.825811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.825783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7db7bfdf6f-5p5zn_f22d56e1-1f03-4547-a594-9ae793ac47c4/console/0.log" Apr 16 13:20:23.825981 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.825824 2574 generic.go:358] "Generic (PLEG): container finished" podID="f22d56e1-1f03-4547-a594-9ae793ac47c4" containerID="94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f" exitCode=2 Apr 16 13:20:23.825981 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.825897 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7db7bfdf6f-5p5zn" Apr 16 13:20:23.825981 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.825907 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db7bfdf6f-5p5zn" event={"ID":"f22d56e1-1f03-4547-a594-9ae793ac47c4","Type":"ContainerDied","Data":"94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f"} Apr 16 13:20:23.825981 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.825946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7db7bfdf6f-5p5zn" event={"ID":"f22d56e1-1f03-4547-a594-9ae793ac47c4","Type":"ContainerDied","Data":"b0a168e683f5ca9dfef9476ce1c7071de74b6aa98722234a45b55d2d37f24aea"} Apr 16 13:20:23.825981 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.825962 2574 scope.go:117] "RemoveContainer" containerID="94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f" Apr 16 13:20:23.835399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.835381 2574 scope.go:117] "RemoveContainer" containerID="94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f" Apr 16 13:20:23.835715 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:20:23.835692 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f\": container with ID starting with 94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f not found: ID does not exist" containerID="94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f" Apr 16 13:20:23.835785 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.835724 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f"} err="failed to get container status \"94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f\": rpc error: code = 
NotFound desc = could not find container \"94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f\": container with ID starting with 94eb395fc8b4f21763b1531301d39cc60a977cdba7347e4d6090bd965b61568f not found: ID does not exist" Apr 16 13:20:23.853636 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.853603 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7db7bfdf6f-5p5zn"] Apr 16 13:20:23.856081 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.856059 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7db7bfdf6f-5p5zn"] Apr 16 13:20:23.883665 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:23.883636 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f22d56e1-1f03-4547-a594-9ae793ac47c4" path="/var/lib/kubelet/pods/f22d56e1-1f03-4547-a594-9ae793ac47c4/volumes" Apr 16 13:20:41.112364 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.112329 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gbrfd"] Apr 16 13:20:41.112801 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.112702 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f22d56e1-1f03-4547-a594-9ae793ac47c4" containerName="console" Apr 16 13:20:41.112801 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.112715 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f22d56e1-1f03-4547-a594-9ae793ac47c4" containerName="console" Apr 16 13:20:41.112801 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.112775 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f22d56e1-1f03-4547-a594-9ae793ac47c4" containerName="console" Apr 16 13:20:41.136434 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.136404 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gbrfd"] Apr 16 13:20:41.136645 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.136534 2574 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" Apr 16 13:20:41.139402 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.139368 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-8qrtv\"" Apr 16 13:20:41.238654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.238611 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hkps\" (UniqueName: \"kubernetes.io/projected/6ba144a0-370c-46ef-9442-f58ef88e95bc-kube-api-access-4hkps\") pod \"authorino-f99f4b5cd-gbrfd\" (UID: \"6ba144a0-370c-46ef-9442-f58ef88e95bc\") " pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" Apr 16 13:20:41.274519 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.274485 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-nwxjp"] Apr 16 13:20:41.277806 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.277789 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-nwxjp" Apr 16 13:20:41.284219 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.284185 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-nwxjp"] Apr 16 13:20:41.339194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.339162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57j9z\" (UniqueName: \"kubernetes.io/projected/81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af-kube-api-access-57j9z\") pod \"authorino-7498df8756-nwxjp\" (UID: \"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af\") " pod="kuadrant-system/authorino-7498df8756-nwxjp" Apr 16 13:20:41.339391 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.339218 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hkps\" (UniqueName: \"kubernetes.io/projected/6ba144a0-370c-46ef-9442-f58ef88e95bc-kube-api-access-4hkps\") pod \"authorino-f99f4b5cd-gbrfd\" (UID: \"6ba144a0-370c-46ef-9442-f58ef88e95bc\") " pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" Apr 16 13:20:41.347405 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.347371 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hkps\" (UniqueName: \"kubernetes.io/projected/6ba144a0-370c-46ef-9442-f58ef88e95bc-kube-api-access-4hkps\") pod \"authorino-f99f4b5cd-gbrfd\" (UID: \"6ba144a0-370c-46ef-9442-f58ef88e95bc\") " pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" Apr 16 13:20:41.440602 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.440569 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57j9z\" (UniqueName: \"kubernetes.io/projected/81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af-kube-api-access-57j9z\") pod \"authorino-7498df8756-nwxjp\" (UID: \"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af\") " pod="kuadrant-system/authorino-7498df8756-nwxjp" Apr 16 13:20:41.446480 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.446425 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" Apr 16 13:20:41.449814 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.449793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57j9z\" (UniqueName: \"kubernetes.io/projected/81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af-kube-api-access-57j9z\") pod \"authorino-7498df8756-nwxjp\" (UID: \"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af\") " pod="kuadrant-system/authorino-7498df8756-nwxjp" Apr 16 13:20:41.570801 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.570777 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gbrfd"] Apr 16 13:20:41.588051 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.588023 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-nwxjp" Apr 16 13:20:41.709082 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.709055 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-nwxjp"] Apr 16 13:20:41.710096 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:20:41.710062 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81ddeba2_eb4c_4788_9ce1_6bf0d7fcf1af.slice/crio-4212329e209bea46ef83aaa7f8610be74a7ff6340563984f08d95a67243b1ee8 WatchSource:0}: Error finding container 4212329e209bea46ef83aaa7f8610be74a7ff6340563984f08d95a67243b1ee8: Status 404 returned error can't find the container with id 4212329e209bea46ef83aaa7f8610be74a7ff6340563984f08d95a67243b1ee8 Apr 16 13:20:41.899261 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.899227 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-nwxjp" 
event={"ID":"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af","Type":"ContainerStarted","Data":"4212329e209bea46ef83aaa7f8610be74a7ff6340563984f08d95a67243b1ee8"} Apr 16 13:20:41.900168 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:41.900143 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" event={"ID":"6ba144a0-370c-46ef-9442-f58ef88e95bc","Type":"ContainerStarted","Data":"4a8206c23b55dc7303b9631b98d5bae490a44016b83b2120da18b7d0966ddbd3"} Apr 16 13:20:44.914781 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:44.914739 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-nwxjp" event={"ID":"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af","Type":"ContainerStarted","Data":"ecb15b9f901ef8477e03b2babc82f5e23f55a0ee30c78d456d7d340622896d51"} Apr 16 13:20:44.916044 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:44.916020 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" event={"ID":"6ba144a0-370c-46ef-9442-f58ef88e95bc","Type":"ContainerStarted","Data":"45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71"} Apr 16 13:20:44.932465 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:44.932413 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-nwxjp" podStartSLOduration=1.593418974 podStartE2EDuration="3.932399163s" podCreationTimestamp="2026-04-16 13:20:41 +0000 UTC" firstStartedPulling="2026-04-16 13:20:41.711329216 +0000 UTC m=+544.401261009" lastFinishedPulling="2026-04-16 13:20:44.050309402 +0000 UTC m=+546.740241198" observedRunningTime="2026-04-16 13:20:44.930635128 +0000 UTC m=+547.620566943" watchObservedRunningTime="2026-04-16 13:20:44.932399163 +0000 UTC m=+547.622330978" Apr 16 13:20:44.947595 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:44.947550 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" podStartSLOduration=1.485442747 podStartE2EDuration="3.947536638s" podCreationTimestamp="2026-04-16 13:20:41 +0000 UTC" firstStartedPulling="2026-04-16 13:20:41.575858685 +0000 UTC m=+544.265790478" lastFinishedPulling="2026-04-16 13:20:44.037952571 +0000 UTC m=+546.727884369" observedRunningTime="2026-04-16 13:20:44.944328397 +0000 UTC m=+547.634260214" watchObservedRunningTime="2026-04-16 13:20:44.947536638 +0000 UTC m=+547.637468458" Apr 16 13:20:44.972172 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:44.972140 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gbrfd"] Apr 16 13:20:46.923109 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:46.923070 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" podUID="6ba144a0-370c-46ef-9442-f58ef88e95bc" containerName="authorino" containerID="cri-o://45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71" gracePeriod=30 Apr 16 13:20:47.168953 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.168928 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" Apr 16 13:20:47.193489 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.193400 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hkps\" (UniqueName: \"kubernetes.io/projected/6ba144a0-370c-46ef-9442-f58ef88e95bc-kube-api-access-4hkps\") pod \"6ba144a0-370c-46ef-9442-f58ef88e95bc\" (UID: \"6ba144a0-370c-46ef-9442-f58ef88e95bc\") " Apr 16 13:20:47.195529 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.195498 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba144a0-370c-46ef-9442-f58ef88e95bc-kube-api-access-4hkps" (OuterVolumeSpecName: "kube-api-access-4hkps") pod "6ba144a0-370c-46ef-9442-f58ef88e95bc" (UID: "6ba144a0-370c-46ef-9442-f58ef88e95bc"). InnerVolumeSpecName "kube-api-access-4hkps". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:20:47.294847 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.294795 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4hkps\" (UniqueName: \"kubernetes.io/projected/6ba144a0-370c-46ef-9442-f58ef88e95bc-kube-api-access-4hkps\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:20:47.927438 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.927397 2574 generic.go:358] "Generic (PLEG): container finished" podID="6ba144a0-370c-46ef-9442-f58ef88e95bc" containerID="45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71" exitCode=0 Apr 16 13:20:47.927860 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.927480 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" Apr 16 13:20:47.927860 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.927490 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" event={"ID":"6ba144a0-370c-46ef-9442-f58ef88e95bc","Type":"ContainerDied","Data":"45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71"} Apr 16 13:20:47.927860 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.927528 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-gbrfd" event={"ID":"6ba144a0-370c-46ef-9442-f58ef88e95bc","Type":"ContainerDied","Data":"4a8206c23b55dc7303b9631b98d5bae490a44016b83b2120da18b7d0966ddbd3"} Apr 16 13:20:47.927860 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.927543 2574 scope.go:117] "RemoveContainer" containerID="45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71" Apr 16 13:20:47.938071 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.938021 2574 scope.go:117] "RemoveContainer" containerID="45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71" Apr 16 13:20:47.938254 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:20:47.938233 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71\": container with ID starting with 45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71 not found: ID does not exist" containerID="45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71" Apr 16 13:20:47.938364 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.938261 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71"} err="failed to get container status \"45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71\": rpc error: code = 
NotFound desc = could not find container \"45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71\": container with ID starting with 45718d92c5a98eab7812ef5756b57a67b52049697130fa3001f34f4fdc0e4d71 not found: ID does not exist" Apr 16 13:20:47.947864 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.947839 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gbrfd"] Apr 16 13:20:47.955412 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:47.954065 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-gbrfd"] Apr 16 13:20:49.883636 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:20:49.883607 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba144a0-370c-46ef-9442-f58ef88e95bc" path="/var/lib/kubelet/pods/6ba144a0-370c-46ef-9442-f58ef88e95bc/volumes" Apr 16 13:21:15.229850 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.229815 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m"] Apr 16 13:21:15.230285 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.230215 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ba144a0-370c-46ef-9442-f58ef88e95bc" containerName="authorino" Apr 16 13:21:15.230285 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.230230 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba144a0-370c-46ef-9442-f58ef88e95bc" containerName="authorino" Apr 16 13:21:15.230363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.230297 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ba144a0-370c-46ef-9442-f58ef88e95bc" containerName="authorino" Apr 16 13:21:15.244835 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.244797 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m"] Apr 16 
13:21:15.245015 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.244920 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.248140 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.248113 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 13:21:15.248265 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.248113 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 13:21:15.249509 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.249482 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-682jg\"" Apr 16 13:21:15.339054 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.339021 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.339246 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.339070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.339246 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.339170 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klzgd\" (UniqueName: \"kubernetes.io/projected/4ceb0fe3-56e2-443d-86b0-bebbf1666433-kube-api-access-klzgd\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.440061 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.440022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klzgd\" (UniqueName: \"kubernetes.io/projected/4ceb0fe3-56e2-443d-86b0-bebbf1666433-kube-api-access-klzgd\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.440263 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.440112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.440263 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.440157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.440530 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.440508 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.440619 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.440567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.456800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.456770 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klzgd\" (UniqueName: \"kubernetes.io/projected/4ceb0fe3-56e2-443d-86b0-bebbf1666433-kube-api-access-klzgd\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.554995 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.554875 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:15.680628 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:15.680596 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m"] Apr 16 13:21:15.681741 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:21:15.681712 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ceb0fe3_56e2_443d_86b0_bebbf1666433.slice/crio-ce3008e99c161c29a866fb4459e2ede2bf79afa53a94ed930004eb7f6a00f87a WatchSource:0}: Error finding container ce3008e99c161c29a866fb4459e2ede2bf79afa53a94ed930004eb7f6a00f87a: Status 404 returned error can't find the container with id ce3008e99c161c29a866fb4459e2ede2bf79afa53a94ed930004eb7f6a00f87a Apr 16 13:21:16.036763 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:16.036730 2574 generic.go:358] "Generic (PLEG): container finished" podID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerID="07fdee394038498e8fd6d10ddadcfceacedc6a88a891e2ba4743635a9b9013b6" exitCode=0 Apr 16 13:21:16.036940 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:16.036820 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" event={"ID":"4ceb0fe3-56e2-443d-86b0-bebbf1666433","Type":"ContainerDied","Data":"07fdee394038498e8fd6d10ddadcfceacedc6a88a891e2ba4743635a9b9013b6"} Apr 16 13:21:16.036940 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:16.036856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" event={"ID":"4ceb0fe3-56e2-443d-86b0-bebbf1666433","Type":"ContainerStarted","Data":"ce3008e99c161c29a866fb4459e2ede2bf79afa53a94ed930004eb7f6a00f87a"} Apr 16 13:21:17.041884 ip-10-0-137-242 kubenswrapper[2574]: 
I0416 13:21:17.041798 2574 generic.go:358] "Generic (PLEG): container finished" podID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerID="e1e638285cdb64787335fc651affa3fe5889e7f0cd7e3325e6741fdb9688bdeb" exitCode=0 Apr 16 13:21:17.042239 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:17.041924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" event={"ID":"4ceb0fe3-56e2-443d-86b0-bebbf1666433","Type":"ContainerDied","Data":"e1e638285cdb64787335fc651affa3fe5889e7f0cd7e3325e6741fdb9688bdeb"} Apr 16 13:21:18.047870 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:18.047833 2574 generic.go:358] "Generic (PLEG): container finished" podID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerID="576954b3f3a02c1ec74b9ac41e8777227988ea412f616ee0f3fdfcad3adb16d4" exitCode=0 Apr 16 13:21:18.048241 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:18.047921 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" event={"ID":"4ceb0fe3-56e2-443d-86b0-bebbf1666433","Type":"ContainerDied","Data":"576954b3f3a02c1ec74b9ac41e8777227988ea412f616ee0f3fdfcad3adb16d4"} Apr 16 13:21:19.184345 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.184322 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:19.376635 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.376533 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klzgd\" (UniqueName: \"kubernetes.io/projected/4ceb0fe3-56e2-443d-86b0-bebbf1666433-kube-api-access-klzgd\") pod \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " Apr 16 13:21:19.376635 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.376581 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-util\") pod \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " Apr 16 13:21:19.376874 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.376684 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-bundle\") pod \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\" (UID: \"4ceb0fe3-56e2-443d-86b0-bebbf1666433\") " Apr 16 13:21:19.377240 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.377212 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-bundle" (OuterVolumeSpecName: "bundle") pod "4ceb0fe3-56e2-443d-86b0-bebbf1666433" (UID: "4ceb0fe3-56e2-443d-86b0-bebbf1666433"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:21:19.378685 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.378664 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ceb0fe3-56e2-443d-86b0-bebbf1666433-kube-api-access-klzgd" (OuterVolumeSpecName: "kube-api-access-klzgd") pod "4ceb0fe3-56e2-443d-86b0-bebbf1666433" (UID: "4ceb0fe3-56e2-443d-86b0-bebbf1666433"). InnerVolumeSpecName "kube-api-access-klzgd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:21:19.382943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.382910 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-util" (OuterVolumeSpecName: "util") pod "4ceb0fe3-56e2-443d-86b0-bebbf1666433" (UID: "4ceb0fe3-56e2-443d-86b0-bebbf1666433"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:21:19.477408 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.477377 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-klzgd\" (UniqueName: \"kubernetes.io/projected/4ceb0fe3-56e2-443d-86b0-bebbf1666433-kube-api-access-klzgd\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:21:19.477408 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.477405 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-util\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:21:19.477408 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:19.477415 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ceb0fe3-56e2-443d-86b0-bebbf1666433-bundle\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:21:20.057391 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:20.057358 2574 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" Apr 16 13:21:20.057391 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:20.057374 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae13504gq2m" event={"ID":"4ceb0fe3-56e2-443d-86b0-bebbf1666433","Type":"ContainerDied","Data":"ce3008e99c161c29a866fb4459e2ede2bf79afa53a94ed930004eb7f6a00f87a"} Apr 16 13:21:20.057646 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:20.057407 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce3008e99c161c29a866fb4459e2ede2bf79afa53a94ed930004eb7f6a00f87a" Apr 16 13:21:37.803563 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:37.803533 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:21:37.805571 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:37.805549 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:21:37.807278 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:37.807261 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:21:37.808939 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:37.808922 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:21:38.091398 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.091316 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:21:38.091713 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:21:38.091700 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerName="extract" Apr 16 13:21:38.091756 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.091715 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerName="extract" Apr 16 13:21:38.091756 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.091730 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerName="util" Apr 16 13:21:38.091756 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.091735 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerName="util" Apr 16 13:21:38.091756 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.091744 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerName="pull" Apr 16 13:21:38.091756 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.091749 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerName="pull" Apr 16 13:21:38.091909 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.091804 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ceb0fe3-56e2-443d-86b0-bebbf1666433" containerName="extract" Apr 16 13:21:38.096032 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.096013 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 16 13:21:38.099111 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.099084 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 16 13:21:38.099368 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.099114 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 16 13:21:38.099520 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.099114 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 16 13:21:38.100525 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.100502 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-l7jss\"" Apr 16 13:21:38.104406 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.104384 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:21:38.147761 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.147725 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9tp\" (UniqueName: \"kubernetes.io/projected/aa6163cd-bbd1-4f1a-a7c7-7c5474840715-kube-api-access-7m9tp\") pod \"maas-keycloak-0\" (UID: \"aa6163cd-bbd1-4f1a-a7c7-7c5474840715\") " pod="keycloak-system/maas-keycloak-0" Apr 16 13:21:38.248413 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.248368 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9tp\" (UniqueName: \"kubernetes.io/projected/aa6163cd-bbd1-4f1a-a7c7-7c5474840715-kube-api-access-7m9tp\") pod \"maas-keycloak-0\" (UID: \"aa6163cd-bbd1-4f1a-a7c7-7c5474840715\") " pod="keycloak-system/maas-keycloak-0" Apr 16 13:21:38.256248 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.256223 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m9tp\" (UniqueName: \"kubernetes.io/projected/aa6163cd-bbd1-4f1a-a7c7-7c5474840715-kube-api-access-7m9tp\") pod \"maas-keycloak-0\" (UID: \"aa6163cd-bbd1-4f1a-a7c7-7c5474840715\") " pod="keycloak-system/maas-keycloak-0" Apr 16 13:21:38.407329 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.407236 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 16 13:21:38.539194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:38.539164 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:21:38.540374 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:21:38.540352 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa6163cd_bbd1_4f1a_a7c7_7c5474840715.slice/crio-ab7230b1206c92f67ae4c791cb85a9af75a976b2b56d67381d8233f06013b709 WatchSource:0}: Error finding container ab7230b1206c92f67ae4c791cb85a9af75a976b2b56d67381d8233f06013b709: Status 404 returned error can't find the container with id ab7230b1206c92f67ae4c791cb85a9af75a976b2b56d67381d8233f06013b709 Apr 16 13:21:39.142362 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:39.142319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"aa6163cd-bbd1-4f1a-a7c7-7c5474840715","Type":"ContainerStarted","Data":"ab7230b1206c92f67ae4c791cb85a9af75a976b2b56d67381d8233f06013b709"} Apr 16 13:21:44.165610 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:44.165569 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"aa6163cd-bbd1-4f1a-a7c7-7c5474840715","Type":"ContainerStarted","Data":"c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2"} Apr 16 13:21:44.185772 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:44.185720 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.4766229929999999 podStartE2EDuration="6.185703171s" podCreationTimestamp="2026-04-16 13:21:38 +0000 UTC" firstStartedPulling="2026-04-16 13:21:38.541720676 +0000 UTC m=+601.231652468" lastFinishedPulling="2026-04-16 13:21:43.250800852 +0000 UTC m=+605.940732646" observedRunningTime="2026-04-16 13:21:44.183878881 +0000 UTC m=+606.873810700" watchObservedRunningTime="2026-04-16 13:21:44.185703171 +0000 UTC m=+606.875634986" Apr 16 13:21:44.407877 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:44.407829 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 16 13:21:44.409760 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:44.409718 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:45.408485 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:45.408405 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:46.408133 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:46.408067 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:47.408286 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:47.408239 2574 prober.go:120] "Probe 
failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:48.408528 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:48.408424 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 16 13:21:48.409125 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:48.408821 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:49.407897 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:49.407840 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:50.407760 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:50.407713 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:51.408819 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:51.408765 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 
13:21:52.407676 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:52.407628 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:53.408076 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:53.408025 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:54.408281 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:54.408237 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:55.408388 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:55.408343 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.48:9000/health/started\": dial tcp 10.134.0.48:9000: connect: connection refused" Apr 16 13:21:56.526534 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:56.526484 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 16 13:21:56.546116 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:21:56.546071 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" probeResult="failure" output="HTTP probe 
failed with statuscode: 503" Apr 16 13:22:06.532642 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:06.532596 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:07.766242 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.766203 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5c9fb6b747-v6jg6"] Apr 16 13:22:07.777555 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.777532 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5c9fb6b747-v6jg6" Apr 16 13:22:07.780321 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.780297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 16 13:22:07.780586 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.780566 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 16 13:22:07.781241 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.781224 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-dkfvz\"" Apr 16 13:22:07.783259 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.783237 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5c9fb6b747-v6jg6"] Apr 16 13:22:07.786514 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.786250 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-f7984c5cd-nl8tl"] Apr 16 13:22:07.789838 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.789821 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" Apr 16 13:22:07.792299 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.792281 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-qd7v7\"" Apr 16 13:22:07.799398 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.799375 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f7984c5cd-nl8tl"] Apr 16 13:22:07.832260 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.832231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/26e38b2e-b3d0-4542-8878-d6c198983a75-maas-api-tls\") pod \"maas-api-5c9fb6b747-v6jg6\" (UID: \"26e38b2e-b3d0-4542-8878-d6c198983a75\") " pod="opendatahub/maas-api-5c9fb6b747-v6jg6" Apr 16 13:22:07.832416 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.832315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzrqb\" (UniqueName: \"kubernetes.io/projected/3db4cc7d-8b99-4697-b428-c846f7e3a3ab-kube-api-access-lzrqb\") pod \"maas-controller-f7984c5cd-nl8tl\" (UID: \"3db4cc7d-8b99-4697-b428-c846f7e3a3ab\") " pod="opendatahub/maas-controller-f7984c5cd-nl8tl" Apr 16 13:22:07.832416 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.832349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldtdn\" (UniqueName: \"kubernetes.io/projected/26e38b2e-b3d0-4542-8878-d6c198983a75-kube-api-access-ldtdn\") pod \"maas-api-5c9fb6b747-v6jg6\" (UID: \"26e38b2e-b3d0-4542-8878-d6c198983a75\") " pod="opendatahub/maas-api-5c9fb6b747-v6jg6" Apr 16 13:22:07.932794 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.932758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzrqb\" (UniqueName: 
\"kubernetes.io/projected/3db4cc7d-8b99-4697-b428-c846f7e3a3ab-kube-api-access-lzrqb\") pod \"maas-controller-f7984c5cd-nl8tl\" (UID: \"3db4cc7d-8b99-4697-b428-c846f7e3a3ab\") " pod="opendatahub/maas-controller-f7984c5cd-nl8tl" Apr 16 13:22:07.932968 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.932804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldtdn\" (UniqueName: \"kubernetes.io/projected/26e38b2e-b3d0-4542-8878-d6c198983a75-kube-api-access-ldtdn\") pod \"maas-api-5c9fb6b747-v6jg6\" (UID: \"26e38b2e-b3d0-4542-8878-d6c198983a75\") " pod="opendatahub/maas-api-5c9fb6b747-v6jg6" Apr 16 13:22:07.932968 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.932880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/26e38b2e-b3d0-4542-8878-d6c198983a75-maas-api-tls\") pod \"maas-api-5c9fb6b747-v6jg6\" (UID: \"26e38b2e-b3d0-4542-8878-d6c198983a75\") " pod="opendatahub/maas-api-5c9fb6b747-v6jg6" Apr 16 13:22:07.935293 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.935270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/26e38b2e-b3d0-4542-8878-d6c198983a75-maas-api-tls\") pod \"maas-api-5c9fb6b747-v6jg6\" (UID: \"26e38b2e-b3d0-4542-8878-d6c198983a75\") " pod="opendatahub/maas-api-5c9fb6b747-v6jg6" Apr 16 13:22:07.949966 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.949939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzrqb\" (UniqueName: \"kubernetes.io/projected/3db4cc7d-8b99-4697-b428-c846f7e3a3ab-kube-api-access-lzrqb\") pod \"maas-controller-f7984c5cd-nl8tl\" (UID: \"3db4cc7d-8b99-4697-b428-c846f7e3a3ab\") " pod="opendatahub/maas-controller-f7984c5cd-nl8tl" Apr 16 13:22:07.950928 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:07.950907 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-ldtdn\" (UniqueName: \"kubernetes.io/projected/26e38b2e-b3d0-4542-8878-d6c198983a75-kube-api-access-ldtdn\") pod \"maas-api-5c9fb6b747-v6jg6\" (UID: \"26e38b2e-b3d0-4542-8878-d6c198983a75\") " pod="opendatahub/maas-api-5c9fb6b747-v6jg6"
Apr 16 13:22:08.090127 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.090027 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5c9fb6b747-v6jg6"
Apr 16 13:22:08.100937 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.100903 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f7984c5cd-nl8tl"
Apr 16 13:22:08.258236 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.258192 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-f7984c5cd-nl8tl"]
Apr 16 13:22:08.263614 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:22:08.263580 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db4cc7d_8b99_4697_b428_c846f7e3a3ab.slice/crio-f49cb763a19a942e644a48570391fe927ed2766f26cf5039b5a548eb98d52a7e WatchSource:0}: Error finding container f49cb763a19a942e644a48570391fe927ed2766f26cf5039b5a548eb98d52a7e: Status 404 returned error can't find the container with id f49cb763a19a942e644a48570391fe927ed2766f26cf5039b5a548eb98d52a7e
Apr 16 13:22:08.268349 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.268327 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:22:08.285166 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.285131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" event={"ID":"3db4cc7d-8b99-4697-b428-c846f7e3a3ab","Type":"ContainerStarted","Data":"f49cb763a19a942e644a48570391fe927ed2766f26cf5039b5a548eb98d52a7e"}
Apr 16 13:22:08.436964 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.436934 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5c9fb6b747-v6jg6"]
Apr 16 13:22:08.437871 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:22:08.437842 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26e38b2e_b3d0_4542_8878_d6c198983a75.slice/crio-448f07c86e6880c24b29f7d625ea17d5d1e22c0855c8e88f6ea4ed9d4d83c7b4 WatchSource:0}: Error finding container 448f07c86e6880c24b29f7d625ea17d5d1e22c0855c8e88f6ea4ed9d4d83c7b4: Status 404 returned error can't find the container with id 448f07c86e6880c24b29f7d625ea17d5d1e22c0855c8e88f6ea4ed9d4d83c7b4
Apr 16 13:22:08.485082 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.485046 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-n5l2s"]
Apr 16 13:22:08.490079 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.490060 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-n5l2s"
Apr 16 13:22:08.495366 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.495342 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-n5l2s"]
Apr 16 13:22:08.539201 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.539170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bl6j\" (UniqueName: \"kubernetes.io/projected/d80e187a-1580-4cea-9342-c0673108c17d-kube-api-access-5bl6j\") pod \"authorino-8b475cf9f-n5l2s\" (UID: \"d80e187a-1580-4cea-9342-c0673108c17d\") " pod="kuadrant-system/authorino-8b475cf9f-n5l2s"
Apr 16 13:22:08.640065 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.640032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bl6j\" (UniqueName: \"kubernetes.io/projected/d80e187a-1580-4cea-9342-c0673108c17d-kube-api-access-5bl6j\") pod \"authorino-8b475cf9f-n5l2s\" (UID: \"d80e187a-1580-4cea-9342-c0673108c17d\") " pod="kuadrant-system/authorino-8b475cf9f-n5l2s"
Apr 16 13:22:08.649720 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.649683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bl6j\" (UniqueName: \"kubernetes.io/projected/d80e187a-1580-4cea-9342-c0673108c17d-kube-api-access-5bl6j\") pod \"authorino-8b475cf9f-n5l2s\" (UID: \"d80e187a-1580-4cea-9342-c0673108c17d\") " pod="kuadrant-system/authorino-8b475cf9f-n5l2s"
Apr 16 13:22:08.713886 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.713800 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-n5l2s"]
Apr 16 13:22:08.714067 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.714055 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-n5l2s"
Apr 16 13:22:08.741525 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.741489 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-75579987c7-t7bjb"]
Apr 16 13:22:08.746960 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.746932 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-75579987c7-t7bjb"
Apr 16 13:22:08.749547 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.749478 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-75579987c7-t7bjb"]
Apr 16 13:22:08.841077 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.841040 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jx7h\" (UniqueName: \"kubernetes.io/projected/4e531371-eccc-4b4d-8cea-b193f231a829-kube-api-access-5jx7h\") pod \"authorino-75579987c7-t7bjb\" (UID: \"4e531371-eccc-4b4d-8cea-b193f231a829\") " pod="kuadrant-system/authorino-75579987c7-t7bjb"
Apr 16 13:22:08.843499 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.843447 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-75579987c7-t7bjb"]
Apr 16 13:22:08.843724 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:22:08.843702 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5jx7h], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-75579987c7-t7bjb" podUID="4e531371-eccc-4b4d-8cea-b193f231a829"
Apr 16 13:22:08.848123 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.848102 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-n5l2s"]
Apr 16 13:22:08.849373 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:22:08.849347 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd80e187a_1580_4cea_9342_c0673108c17d.slice/crio-409385240b1add4b484aa19a91423dd9ffec6e27f17bf11e27e049b7cd17fcd9 WatchSource:0}: Error finding container 409385240b1add4b484aa19a91423dd9ffec6e27f17bf11e27e049b7cd17fcd9: Status 404 returned error can't find the container with id 409385240b1add4b484aa19a91423dd9ffec6e27f17bf11e27e049b7cd17fcd9
Apr 16 13:22:08.869556 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.869525 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-57b4dcbf6f-zncdd"]
Apr 16 13:22:08.873251 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.873230 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:22:08.876334 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.876306 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 16 13:22:08.879754 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.879728 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-57b4dcbf6f-zncdd"]
Apr 16 13:22:08.941910 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.941869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jx7h\" (UniqueName: \"kubernetes.io/projected/4e531371-eccc-4b4d-8cea-b193f231a829-kube-api-access-5jx7h\") pod \"authorino-75579987c7-t7bjb\" (UID: \"4e531371-eccc-4b4d-8cea-b193f231a829\") " pod="kuadrant-system/authorino-75579987c7-t7bjb"
Apr 16 13:22:08.942100 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.941980 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/85ff623e-2399-460d-80ad-8b7f75994436-tls-cert\") pod \"authorino-57b4dcbf6f-zncdd\" (UID: \"85ff623e-2399-460d-80ad-8b7f75994436\") " pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:22:08.942163 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.942111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5qn7\" (UniqueName: \"kubernetes.io/projected/85ff623e-2399-460d-80ad-8b7f75994436-kube-api-access-j5qn7\") pod \"authorino-57b4dcbf6f-zncdd\" (UID: \"85ff623e-2399-460d-80ad-8b7f75994436\") " pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:22:08.955199 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:08.955168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jx7h\" (UniqueName: \"kubernetes.io/projected/4e531371-eccc-4b4d-8cea-b193f231a829-kube-api-access-5jx7h\") pod \"authorino-75579987c7-t7bjb\" (UID: \"4e531371-eccc-4b4d-8cea-b193f231a829\") " pod="kuadrant-system/authorino-75579987c7-t7bjb"
Apr 16 13:22:09.042962 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.042880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5qn7\" (UniqueName: \"kubernetes.io/projected/85ff623e-2399-460d-80ad-8b7f75994436-kube-api-access-j5qn7\") pod \"authorino-57b4dcbf6f-zncdd\" (UID: \"85ff623e-2399-460d-80ad-8b7f75994436\") " pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:22:09.043131 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.043020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/85ff623e-2399-460d-80ad-8b7f75994436-tls-cert\") pod \"authorino-57b4dcbf6f-zncdd\" (UID: \"85ff623e-2399-460d-80ad-8b7f75994436\") " pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:22:09.046790 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.046732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/85ff623e-2399-460d-80ad-8b7f75994436-tls-cert\") pod \"authorino-57b4dcbf6f-zncdd\" (UID: \"85ff623e-2399-460d-80ad-8b7f75994436\") " pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:22:09.053829 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.053802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5qn7\" (UniqueName: \"kubernetes.io/projected/85ff623e-2399-460d-80ad-8b7f75994436-kube-api-access-j5qn7\") pod \"authorino-57b4dcbf6f-zncdd\" (UID: \"85ff623e-2399-460d-80ad-8b7f75994436\") " pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:22:09.185054 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.185016 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:22:09.294624 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.294498 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-n5l2s" event={"ID":"d80e187a-1580-4cea-9342-c0673108c17d","Type":"ContainerStarted","Data":"409385240b1add4b484aa19a91423dd9ffec6e27f17bf11e27e049b7cd17fcd9"}
Apr 16 13:22:09.297817 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.296752 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-75579987c7-t7bjb"
Apr 16 13:22:09.297817 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.297097 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5c9fb6b747-v6jg6" event={"ID":"26e38b2e-b3d0-4542-8878-d6c198983a75","Type":"ContainerStarted","Data":"448f07c86e6880c24b29f7d625ea17d5d1e22c0855c8e88f6ea4ed9d4d83c7b4"}
Apr 16 13:22:09.306504 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.306172 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-75579987c7-t7bjb"
Apr 16 13:22:09.347864 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.347824 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jx7h\" (UniqueName: \"kubernetes.io/projected/4e531371-eccc-4b4d-8cea-b193f231a829-kube-api-access-5jx7h\") pod \"4e531371-eccc-4b4d-8cea-b193f231a829\" (UID: \"4e531371-eccc-4b4d-8cea-b193f231a829\") "
Apr 16 13:22:09.351902 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.351868 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e531371-eccc-4b4d-8cea-b193f231a829-kube-api-access-5jx7h" (OuterVolumeSpecName: "kube-api-access-5jx7h") pod "4e531371-eccc-4b4d-8cea-b193f231a829" (UID: "4e531371-eccc-4b4d-8cea-b193f231a829"). InnerVolumeSpecName "kube-api-access-5jx7h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:22:09.385447 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.385414 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-57b4dcbf6f-zncdd"]
Apr 16 13:22:09.449129 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:09.449093 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jx7h\" (UniqueName: \"kubernetes.io/projected/4e531371-eccc-4b4d-8cea-b193f231a829-kube-api-access-5jx7h\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:22:10.303266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.303216 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd" event={"ID":"85ff623e-2399-460d-80ad-8b7f75994436","Type":"ContainerStarted","Data":"8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1"}
Apr 16 13:22:10.303266 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.303271 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd" event={"ID":"85ff623e-2399-460d-80ad-8b7f75994436","Type":"ContainerStarted","Data":"5190146b367ad0137c52d8bb6125a9abc56bd80ddbad157c3bddf16ea8a7b028"}
Apr 16 13:22:10.306194 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.306162 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-75579987c7-t7bjb"
Apr 16 13:22:10.306600 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.306566 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-n5l2s" event={"ID":"d80e187a-1580-4cea-9342-c0673108c17d","Type":"ContainerStarted","Data":"16a6065eaac98abcf168ff83a46a017da955af3bff7e49452f73989b81c5dada"}
Apr 16 13:22:10.306734 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.306629 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-n5l2s" podUID="d80e187a-1580-4cea-9342-c0673108c17d" containerName="authorino" containerID="cri-o://16a6065eaac98abcf168ff83a46a017da955af3bff7e49452f73989b81c5dada" gracePeriod=30
Apr 16 13:22:10.321996 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.321942 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd" podStartSLOduration=1.96006828 podStartE2EDuration="2.321925138s" podCreationTimestamp="2026-04-16 13:22:08 +0000 UTC" firstStartedPulling="2026-04-16 13:22:09.387196229 +0000 UTC m=+632.077128026" lastFinishedPulling="2026-04-16 13:22:09.749053081 +0000 UTC m=+632.438984884" observedRunningTime="2026-04-16 13:22:10.31823449 +0000 UTC m=+633.008166309" watchObservedRunningTime="2026-04-16 13:22:10.321925138 +0000 UTC m=+633.011856984"
Apr 16 13:22:10.349289 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.349253 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-nwxjp"]
Apr 16 13:22:10.349579 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.349553 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-nwxjp" podUID="81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af" containerName="authorino" containerID="cri-o://ecb15b9f901ef8477e03b2babc82f5e23f55a0ee30c78d456d7d340622896d51" gracePeriod=30
Apr 16 13:22:10.357201 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.357171 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-75579987c7-t7bjb"]
Apr 16 13:22:10.361117 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.361094 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-75579987c7-t7bjb"]
Apr 16 13:22:10.377445 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:10.376941 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-n5l2s" podStartSLOduration=1.956078562 podStartE2EDuration="2.376922492s" podCreationTimestamp="2026-04-16 13:22:08 +0000 UTC" firstStartedPulling="2026-04-16 13:22:08.850815743 +0000 UTC m=+631.540747540" lastFinishedPulling="2026-04-16 13:22:09.271659673 +0000 UTC m=+631.961591470" observedRunningTime="2026-04-16 13:22:10.37415905 +0000 UTC m=+633.064090870" watchObservedRunningTime="2026-04-16 13:22:10.376922492 +0000 UTC m=+633.066854308"
Apr 16 13:22:11.313854 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.313821 2574 generic.go:358] "Generic (PLEG): container finished" podID="81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af" containerID="ecb15b9f901ef8477e03b2babc82f5e23f55a0ee30c78d456d7d340622896d51" exitCode=0
Apr 16 13:22:11.314303 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.313902 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-nwxjp" event={"ID":"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af","Type":"ContainerDied","Data":"ecb15b9f901ef8477e03b2babc82f5e23f55a0ee30c78d456d7d340622896d51"}
Apr 16 13:22:11.315585 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.315558 2574 generic.go:358] "Generic (PLEG): container finished" podID="d80e187a-1580-4cea-9342-c0673108c17d" containerID="16a6065eaac98abcf168ff83a46a017da955af3bff7e49452f73989b81c5dada" exitCode=0
Apr 16 13:22:11.315720 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.315634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-n5l2s" event={"ID":"d80e187a-1580-4cea-9342-c0673108c17d","Type":"ContainerDied","Data":"16a6065eaac98abcf168ff83a46a017da955af3bff7e49452f73989b81c5dada"}
Apr 16 13:22:11.413793 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.413766 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-nwxjp"
Apr 16 13:22:11.418141 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.418125 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-n5l2s"
Apr 16 13:22:11.468011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.467983 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bl6j\" (UniqueName: \"kubernetes.io/projected/d80e187a-1580-4cea-9342-c0673108c17d-kube-api-access-5bl6j\") pod \"d80e187a-1580-4cea-9342-c0673108c17d\" (UID: \"d80e187a-1580-4cea-9342-c0673108c17d\") "
Apr 16 13:22:11.468164 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.468041 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57j9z\" (UniqueName: \"kubernetes.io/projected/81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af-kube-api-access-57j9z\") pod \"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af\" (UID: \"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af\") "
Apr 16 13:22:11.470624 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.470551 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af-kube-api-access-57j9z" (OuterVolumeSpecName: "kube-api-access-57j9z") pod "81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af" (UID: "81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af"). InnerVolumeSpecName "kube-api-access-57j9z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:22:11.470859 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.470831 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d80e187a-1580-4cea-9342-c0673108c17d-kube-api-access-5bl6j" (OuterVolumeSpecName: "kube-api-access-5bl6j") pod "d80e187a-1580-4cea-9342-c0673108c17d" (UID: "d80e187a-1580-4cea-9342-c0673108c17d"). InnerVolumeSpecName "kube-api-access-5bl6j". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:22:11.569880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.569784 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bl6j\" (UniqueName: \"kubernetes.io/projected/d80e187a-1580-4cea-9342-c0673108c17d-kube-api-access-5bl6j\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:22:11.569880 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.569820 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57j9z\" (UniqueName: \"kubernetes.io/projected/81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af-kube-api-access-57j9z\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:22:11.884736 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:11.884647 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e531371-eccc-4b4d-8cea-b193f231a829" path="/var/lib/kubelet/pods/4e531371-eccc-4b4d-8cea-b193f231a829/volumes"
Apr 16 13:22:12.320556 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.320520 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" event={"ID":"3db4cc7d-8b99-4697-b428-c846f7e3a3ab","Type":"ContainerStarted","Data":"1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d"}
Apr 16 13:22:12.321106 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.320632 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-f7984c5cd-nl8tl"
Apr 16 13:22:12.321699 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.321683 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-nwxjp"
Apr 16 13:22:12.321812 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.321701 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-nwxjp" event={"ID":"81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af","Type":"ContainerDied","Data":"4212329e209bea46ef83aaa7f8610be74a7ff6340563984f08d95a67243b1ee8"}
Apr 16 13:22:12.321812 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.321733 2574 scope.go:117] "RemoveContainer" containerID="ecb15b9f901ef8477e03b2babc82f5e23f55a0ee30c78d456d7d340622896d51"
Apr 16 13:22:12.322926 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.322912 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-n5l2s"
Apr 16 13:22:12.323028 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.322943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-n5l2s" event={"ID":"d80e187a-1580-4cea-9342-c0673108c17d","Type":"ContainerDied","Data":"409385240b1add4b484aa19a91423dd9ffec6e27f17bf11e27e049b7cd17fcd9"}
Apr 16 13:22:12.324332 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.324310 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5c9fb6b747-v6jg6" event={"ID":"26e38b2e-b3d0-4542-8878-d6c198983a75","Type":"ContainerStarted","Data":"dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc"}
Apr 16 13:22:12.324489 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.324449 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5c9fb6b747-v6jg6"
Apr 16 13:22:12.331377 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.331360 2574 scope.go:117] "RemoveContainer" containerID="16a6065eaac98abcf168ff83a46a017da955af3bff7e49452f73989b81c5dada"
Apr 16 13:22:12.342497 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.342434 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" podStartSLOduration=2.197360473 podStartE2EDuration="5.342420433s" podCreationTimestamp="2026-04-16 13:22:07 +0000 UTC" firstStartedPulling="2026-04-16 13:22:08.268613527 +0000 UTC m=+630.958545331" lastFinishedPulling="2026-04-16 13:22:11.413673498 +0000 UTC m=+634.103605291" observedRunningTime="2026-04-16 13:22:12.34016331 +0000 UTC m=+635.030095126" watchObservedRunningTime="2026-04-16 13:22:12.342420433 +0000 UTC m=+635.032352250"
Apr 16 13:22:12.353498 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.353448 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-nwxjp"]
Apr 16 13:22:12.358992 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.358970 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-nwxjp"]
Apr 16 13:22:12.378175 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.378143 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-n5l2s"]
Apr 16 13:22:12.382624 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.382574 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-n5l2s"]
Apr 16 13:22:12.393328 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:12.393281 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5c9fb6b747-v6jg6" podStartSLOduration=2.415749754 podStartE2EDuration="5.393262599s" podCreationTimestamp="2026-04-16 13:22:07 +0000 UTC" firstStartedPulling="2026-04-16 13:22:08.439276144 +0000 UTC m=+631.129207942" lastFinishedPulling="2026-04-16 13:22:11.416788991 +0000 UTC m=+634.106720787" observedRunningTime="2026-04-16 13:22:12.391395437 +0000 UTC m=+635.081327276" watchObservedRunningTime="2026-04-16 13:22:12.393262599 +0000 UTC m=+635.083194417"
Apr 16 13:22:13.883068 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:13.883037 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af" path="/var/lib/kubelet/pods/81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af/volumes"
Apr 16 13:22:13.883450 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:13.883360 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d80e187a-1580-4cea-9342-c0673108c17d" path="/var/lib/kubelet/pods/d80e187a-1580-4cea-9342-c0673108c17d/volumes"
Apr 16 13:22:18.043824 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.043794 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-5c9fb6b747-v6jg6"]
Apr 16 13:22:18.044246 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.043998 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-5c9fb6b747-v6jg6" podUID="26e38b2e-b3d0-4542-8878-d6c198983a75" containerName="maas-api" containerID="cri-o://dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc" gracePeriod=30
Apr 16 13:22:18.048914 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.048893 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5c9fb6b747-v6jg6"
Apr 16 13:22:18.285210 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.285185 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5c9fb6b747-v6jg6"
Apr 16 13:22:18.331069 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.330977 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/26e38b2e-b3d0-4542-8878-d6c198983a75-maas-api-tls\") pod \"26e38b2e-b3d0-4542-8878-d6c198983a75\" (UID: \"26e38b2e-b3d0-4542-8878-d6c198983a75\") "
Apr 16 13:22:18.331244 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.331082 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldtdn\" (UniqueName: \"kubernetes.io/projected/26e38b2e-b3d0-4542-8878-d6c198983a75-kube-api-access-ldtdn\") pod \"26e38b2e-b3d0-4542-8878-d6c198983a75\" (UID: \"26e38b2e-b3d0-4542-8878-d6c198983a75\") "
Apr 16 13:22:18.337984 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.337945 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e38b2e-b3d0-4542-8878-d6c198983a75-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "26e38b2e-b3d0-4542-8878-d6c198983a75" (UID: "26e38b2e-b3d0-4542-8878-d6c198983a75"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:22:18.338640 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.338610 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e38b2e-b3d0-4542-8878-d6c198983a75-kube-api-access-ldtdn" (OuterVolumeSpecName: "kube-api-access-ldtdn") pod "26e38b2e-b3d0-4542-8878-d6c198983a75" (UID: "26e38b2e-b3d0-4542-8878-d6c198983a75"). InnerVolumeSpecName "kube-api-access-ldtdn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:22:18.350768 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.350735 2574 generic.go:358] "Generic (PLEG): container finished" podID="26e38b2e-b3d0-4542-8878-d6c198983a75" containerID="dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc" exitCode=0
Apr 16 13:22:18.350884 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.350814 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5c9fb6b747-v6jg6"
Apr 16 13:22:18.350927 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.350804 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5c9fb6b747-v6jg6" event={"ID":"26e38b2e-b3d0-4542-8878-d6c198983a75","Type":"ContainerDied","Data":"dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc"}
Apr 16 13:22:18.350963 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.350920 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5c9fb6b747-v6jg6" event={"ID":"26e38b2e-b3d0-4542-8878-d6c198983a75","Type":"ContainerDied","Data":"448f07c86e6880c24b29f7d625ea17d5d1e22c0855c8e88f6ea4ed9d4d83c7b4"}
Apr 16 13:22:18.350963 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.350946 2574 scope.go:117] "RemoveContainer" containerID="dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc"
Apr 16 13:22:18.361000 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.360979 2574 scope.go:117] "RemoveContainer" containerID="dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc"
Apr 16 13:22:18.361255 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:22:18.361238 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc\": container with ID starting with dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc not found: ID does not exist" containerID="dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc"
Apr 16 13:22:18.361309 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.361263 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc"} err="failed to get container status \"dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc\": rpc error: code = NotFound desc = could not find container \"dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc\": container with ID starting with dffb058313c280f4a0ff551384bb34ee894b6b984928ae965c206b67b17345fc not found: ID does not exist"
Apr 16 13:22:18.374637 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.374590 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-5c9fb6b747-v6jg6"]
Apr 16 13:22:18.378942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.378914 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-5c9fb6b747-v6jg6"]
Apr 16 13:22:18.432214 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.432178 2574 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/26e38b2e-b3d0-4542-8878-d6c198983a75-maas-api-tls\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:22:18.432214 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:18.432213 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ldtdn\" (UniqueName: \"kubernetes.io/projected/26e38b2e-b3d0-4542-8878-d6c198983a75-kube-api-access-ldtdn\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:22:19.884100 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:19.884062 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e38b2e-b3d0-4542-8878-d6c198983a75" path="/var/lib/kubelet/pods/26e38b2e-b3d0-4542-8878-d6c198983a75/volumes"
Apr 16 13:22:23.335248 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.335223 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-f7984c5cd-nl8tl"
Apr 16 13:22:23.628488 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628393 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-579b4f9c55-97k59"]
Apr 16 13:22:23.628814 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628801 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af" containerName="authorino"
Apr 16 13:22:23.628865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628816 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af" containerName="authorino"
Apr 16 13:22:23.628865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628830 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d80e187a-1580-4cea-9342-c0673108c17d" containerName="authorino"
Apr 16 13:22:23.628865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628836 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="d80e187a-1580-4cea-9342-c0673108c17d" containerName="authorino"
Apr 16 13:22:23.628865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628847 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26e38b2e-b3d0-4542-8878-d6c198983a75" containerName="maas-api"
Apr 16 13:22:23.628865 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628852 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e38b2e-b3d0-4542-8878-d6c198983a75" containerName="maas-api"
Apr 16 13:22:23.629020 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628905 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="d80e187a-1580-4cea-9342-c0673108c17d" containerName="authorino"
Apr 16 13:22:23.629020 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628916 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="81ddeba2-eb4c-4788-9ce1-6bf0d7fcf1af" containerName="authorino"
Apr 16 13:22:23.629020 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.628923 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="26e38b2e-b3d0-4542-8878-d6c198983a75" containerName="maas-api"
Apr 16 13:22:23.631945 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.631931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-579b4f9c55-97k59"
Apr 16 13:22:23.637901 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.637875 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-579b4f9c55-97k59"]
Apr 16 13:22:23.783180 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.783146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd5mz\" (UniqueName: \"kubernetes.io/projected/481ab357-361c-40b4-9b1a-7666d917ce71-kube-api-access-cd5mz\") pod \"maas-controller-579b4f9c55-97k59\" (UID: \"481ab357-361c-40b4-9b1a-7666d917ce71\") " pod="opendatahub/maas-controller-579b4f9c55-97k59"
Apr 16 13:22:23.884286 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.884202 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd5mz\" (UniqueName: \"kubernetes.io/projected/481ab357-361c-40b4-9b1a-7666d917ce71-kube-api-access-cd5mz\") pod \"maas-controller-579b4f9c55-97k59\" (UID: \"481ab357-361c-40b4-9b1a-7666d917ce71\") " pod="opendatahub/maas-controller-579b4f9c55-97k59"
Apr 16 13:22:23.893712 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.893678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd5mz\" (UniqueName: \"kubernetes.io/projected/481ab357-361c-40b4-9b1a-7666d917ce71-kube-api-access-cd5mz\") pod \"maas-controller-579b4f9c55-97k59\" (UID: \"481ab357-361c-40b4-9b1a-7666d917ce71\") " pod="opendatahub/maas-controller-579b4f9c55-97k59"
Apr 16 13:22:23.943288 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:23.943249 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-579b4f9c55-97k59"
Apr 16 13:22:24.066781 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:24.066754 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-579b4f9c55-97k59"]
Apr 16 13:22:24.069619 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:22:24.069590 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod481ab357_361c_40b4_9b1a_7666d917ce71.slice/crio-307cc4f0f0784e797cb8ceb5b09271e57aa81910482e9115595693c06c8dc1e2 WatchSource:0}: Error finding container 307cc4f0f0784e797cb8ceb5b09271e57aa81910482e9115595693c06c8dc1e2: Status 404 returned error can't find the container with id 307cc4f0f0784e797cb8ceb5b09271e57aa81910482e9115595693c06c8dc1e2
Apr 16 13:22:24.376038 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:24.376007 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-579b4f9c55-97k59" event={"ID":"481ab357-361c-40b4-9b1a-7666d917ce71","Type":"ContainerStarted","Data":"307cc4f0f0784e797cb8ceb5b09271e57aa81910482e9115595693c06c8dc1e2"}
Apr 16 13:22:25.383998 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:25.383961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod"
pod="opendatahub/maas-controller-579b4f9c55-97k59" event={"ID":"481ab357-361c-40b4-9b1a-7666d917ce71","Type":"ContainerStarted","Data":"3ee6cd0977cc187dea09257ebcebcec140a0819518a4cd000e030b0ca984950b"} Apr 16 13:22:25.384386 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:25.384032 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-579b4f9c55-97k59" Apr 16 13:22:25.401528 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:25.401481 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-579b4f9c55-97k59" podStartSLOduration=2.052460962 podStartE2EDuration="2.401465423s" podCreationTimestamp="2026-04-16 13:22:23 +0000 UTC" firstStartedPulling="2026-04-16 13:22:24.070881783 +0000 UTC m=+646.760813576" lastFinishedPulling="2026-04-16 13:22:24.41988623 +0000 UTC m=+647.109818037" observedRunningTime="2026-04-16 13:22:25.398674731 +0000 UTC m=+648.088606586" watchObservedRunningTime="2026-04-16 13:22:25.401465423 +0000 UTC m=+648.091397233" Apr 16 13:22:36.394439 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:36.394397 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-579b4f9c55-97k59" Apr 16 13:22:36.434191 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:36.434159 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f7984c5cd-nl8tl"] Apr 16 13:22:36.434508 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:36.434446 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" podUID="3db4cc7d-8b99-4697-b428-c846f7e3a3ab" containerName="manager" containerID="cri-o://1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d" gracePeriod=10 Apr 16 13:22:36.684858 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:36.684835 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" Apr 16 13:22:36.806841 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:36.806805 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzrqb\" (UniqueName: \"kubernetes.io/projected/3db4cc7d-8b99-4697-b428-c846f7e3a3ab-kube-api-access-lzrqb\") pod \"3db4cc7d-8b99-4697-b428-c846f7e3a3ab\" (UID: \"3db4cc7d-8b99-4697-b428-c846f7e3a3ab\") " Apr 16 13:22:36.808922 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:36.808895 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db4cc7d-8b99-4697-b428-c846f7e3a3ab-kube-api-access-lzrqb" (OuterVolumeSpecName: "kube-api-access-lzrqb") pod "3db4cc7d-8b99-4697-b428-c846f7e3a3ab" (UID: "3db4cc7d-8b99-4697-b428-c846f7e3a3ab"). InnerVolumeSpecName "kube-api-access-lzrqb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:22:36.908224 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:36.908165 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzrqb\" (UniqueName: \"kubernetes.io/projected/3db4cc7d-8b99-4697-b428-c846f7e3a3ab-kube-api-access-lzrqb\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:22:37.273764 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.273729 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:22:37.273976 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.273950 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" containerID="cri-o://c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2" gracePeriod=30 Apr 16 13:22:37.434443 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.434405 2574 generic.go:358] "Generic (PLEG): container finished" podID="3db4cc7d-8b99-4697-b428-c846f7e3a3ab" 
containerID="1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d" exitCode=0 Apr 16 13:22:37.434942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.434498 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" Apr 16 13:22:37.434942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.434497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" event={"ID":"3db4cc7d-8b99-4697-b428-c846f7e3a3ab","Type":"ContainerDied","Data":"1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d"} Apr 16 13:22:37.434942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.434601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-f7984c5cd-nl8tl" event={"ID":"3db4cc7d-8b99-4697-b428-c846f7e3a3ab","Type":"ContainerDied","Data":"f49cb763a19a942e644a48570391fe927ed2766f26cf5039b5a548eb98d52a7e"} Apr 16 13:22:37.434942 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.434618 2574 scope.go:117] "RemoveContainer" containerID="1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d" Apr 16 13:22:37.444002 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.443988 2574 scope.go:117] "RemoveContainer" containerID="1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d" Apr 16 13:22:37.444252 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:22:37.444237 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d\": container with ID starting with 1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d not found: ID does not exist" containerID="1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d" Apr 16 13:22:37.444322 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.444258 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d"} err="failed to get container status \"1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d\": rpc error: code = NotFound desc = could not find container \"1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d\": container with ID starting with 1bd180fe31d6d81c19fd5415befecd3ecc1c1fb1a40fe5c8148a2b0708c9020d not found: ID does not exist" Apr 16 13:22:37.457044 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.457015 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-f7984c5cd-nl8tl"] Apr 16 13:22:37.460189 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.460166 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-f7984c5cd-nl8tl"] Apr 16 13:22:37.885102 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:37.885070 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db4cc7d-8b99-4697-b428-c846f7e3a3ab" path="/var/lib/kubelet/pods/3db4cc7d-8b99-4697-b428-c846f7e3a3ab/volumes" Apr 16 13:22:38.679564 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:22:38.679531 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa6163cd_bbd1_4f1a_a7c7_7c5474840715.slice/crio-ab7230b1206c92f67ae4c791cb85a9af75a976b2b56d67381d8233f06013b709\": RecentStats: unable to find data in memory cache]" Apr 16 13:22:38.680013 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:22:38.679721 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa6163cd_bbd1_4f1a_a7c7_7c5474840715.slice/crio-ab7230b1206c92f67ae4c791cb85a9af75a976b2b56d67381d8233f06013b709\": RecentStats: unable to find data in memory cache]" Apr 16 13:22:39.331021 
ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.330997 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.429316 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.429282 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m9tp\" (UniqueName: \"kubernetes.io/projected/aa6163cd-bbd1-4f1a-a7c7-7c5474840715-kube-api-access-7m9tp\") pod \"aa6163cd-bbd1-4f1a-a7c7-7c5474840715\" (UID: \"aa6163cd-bbd1-4f1a-a7c7-7c5474840715\") " Apr 16 13:22:39.431525 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.431496 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6163cd-bbd1-4f1a-a7c7-7c5474840715-kube-api-access-7m9tp" (OuterVolumeSpecName: "kube-api-access-7m9tp") pod "aa6163cd-bbd1-4f1a-a7c7-7c5474840715" (UID: "aa6163cd-bbd1-4f1a-a7c7-7c5474840715"). InnerVolumeSpecName "kube-api-access-7m9tp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:22:39.447192 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.447114 2574 generic.go:358] "Generic (PLEG): container finished" podID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerID="c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2" exitCode=143 Apr 16 13:22:39.447192 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.447146 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"aa6163cd-bbd1-4f1a-a7c7-7c5474840715","Type":"ContainerDied","Data":"c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2"} Apr 16 13:22:39.447192 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.447177 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.447192 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.447186 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"aa6163cd-bbd1-4f1a-a7c7-7c5474840715","Type":"ContainerDied","Data":"ab7230b1206c92f67ae4c791cb85a9af75a976b2b56d67381d8233f06013b709"} Apr 16 13:22:39.447444 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.447202 2574 scope.go:117] "RemoveContainer" containerID="c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2" Apr 16 13:22:39.457001 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.456980 2574 scope.go:117] "RemoveContainer" containerID="c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2" Apr 16 13:22:39.457298 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:22:39.457276 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2\": container with ID starting with c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2 not found: ID does not exist" containerID="c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2" Apr 16 13:22:39.457349 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.457307 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2"} err="failed to get container status \"c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2\": rpc error: code = NotFound desc = could not find container \"c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2\": container with ID starting with c22903f089e9d08adeb7b6ee900e529939a63b1eb894a4c8786aad69641454f2 not found: ID does not exist" Apr 16 13:22:39.469946 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.469917 2574 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:22:39.474202 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.474177 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:22:39.492381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.492351 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:22:39.492750 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.492735 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" Apr 16 13:22:39.492797 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.492752 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" Apr 16 13:22:39.492797 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.492770 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3db4cc7d-8b99-4697-b428-c846f7e3a3ab" containerName="manager" Apr 16 13:22:39.492797 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.492775 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db4cc7d-8b99-4697-b428-c846f7e3a3ab" containerName="manager" Apr 16 13:22:39.492905 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.492834 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3db4cc7d-8b99-4697-b428-c846f7e3a3ab" containerName="manager" Apr 16 13:22:39.492905 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.492846 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" containerName="keycloak" Apr 16 13:22:39.497174 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.497156 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.500874 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.500620 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 16 13:22:39.500874 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.500644 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 16 13:22:39.500874 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.500689 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 16 13:22:39.500874 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.500769 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\"" Apr 16 13:22:39.500874 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.500779 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-l7jss\"" Apr 16 13:22:39.504420 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.504392 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:22:39.530270 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.530234 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7m9tp\" (UniqueName: \"kubernetes.io/projected/aa6163cd-bbd1-4f1a-a7c7-7c5474840715-kube-api-access-7m9tp\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:22:39.631158 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.631122 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddk9b\" (UniqueName: \"kubernetes.io/projected/db328bdd-c569-4272-b9c1-67ec54907a5e-kube-api-access-ddk9b\") pod \"maas-keycloak-0\" (UID: \"db328bdd-c569-4272-b9c1-67ec54907a5e\") " 
pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.631342 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.631189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/db328bdd-c569-4272-b9c1-67ec54907a5e-test-realms\") pod \"maas-keycloak-0\" (UID: \"db328bdd-c569-4272-b9c1-67ec54907a5e\") " pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.731715 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.731611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/db328bdd-c569-4272-b9c1-67ec54907a5e-test-realms\") pod \"maas-keycloak-0\" (UID: \"db328bdd-c569-4272-b9c1-67ec54907a5e\") " pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.731715 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.731710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddk9b\" (UniqueName: \"kubernetes.io/projected/db328bdd-c569-4272-b9c1-67ec54907a5e-kube-api-access-ddk9b\") pod \"maas-keycloak-0\" (UID: \"db328bdd-c569-4272-b9c1-67ec54907a5e\") " pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.732305 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.732287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/db328bdd-c569-4272-b9c1-67ec54907a5e-test-realms\") pod \"maas-keycloak-0\" (UID: \"db328bdd-c569-4272-b9c1-67ec54907a5e\") " pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.741658 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.741628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddk9b\" (UniqueName: \"kubernetes.io/projected/db328bdd-c569-4272-b9c1-67ec54907a5e-kube-api-access-ddk9b\") pod \"maas-keycloak-0\" (UID: \"db328bdd-c569-4272-b9c1-67ec54907a5e\") " pod="keycloak-system/maas-keycloak-0" Apr 16 
13:22:39.808290 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.808252 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:39.883989 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.883958 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6163cd-bbd1-4f1a-a7c7-7c5474840715" path="/var/lib/kubelet/pods/aa6163cd-bbd1-4f1a-a7c7-7c5474840715/volumes" Apr 16 13:22:39.935829 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:39.935804 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 16 13:22:39.937256 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:22:39.937228 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb328bdd_c569_4272_b9c1_67ec54907a5e.slice/crio-8d0b18e7ce99ad910eab0d962a28c301b08f90808b8b77688dcc3df692ab4cf7 WatchSource:0}: Error finding container 8d0b18e7ce99ad910eab0d962a28c301b08f90808b8b77688dcc3df692ab4cf7: Status 404 returned error can't find the container with id 8d0b18e7ce99ad910eab0d962a28c301b08f90808b8b77688dcc3df692ab4cf7 Apr 16 13:22:40.453299 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:40.453258 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"db328bdd-c569-4272-b9c1-67ec54907a5e","Type":"ContainerStarted","Data":"28df13cc2b1b7888c1ce610de3e293058308a12339c6e3144d9d99e82f6c53d8"} Apr 16 13:22:40.453299 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:40.453304 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"db328bdd-c569-4272-b9c1-67ec54907a5e","Type":"ContainerStarted","Data":"8d0b18e7ce99ad910eab0d962a28c301b08f90808b8b77688dcc3df692ab4cf7"} Apr 16 13:22:40.475788 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:40.475726 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.119389113 podStartE2EDuration="1.475711093s" podCreationTimestamp="2026-04-16 13:22:39 +0000 UTC" firstStartedPulling="2026-04-16 13:22:39.938671746 +0000 UTC m=+662.628603539" lastFinishedPulling="2026-04-16 13:22:40.29499371 +0000 UTC m=+662.984925519" observedRunningTime="2026-04-16 13:22:40.472405068 +0000 UTC m=+663.162336884" watchObservedRunningTime="2026-04-16 13:22:40.475711093 +0000 UTC m=+663.165642908" Apr 16 13:22:40.809012 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:40.808967 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:40.810944 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:40.810907 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:41.808943 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:41.808896 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:42.809338 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:42.809286 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:43.808955 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:43.808899 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" 
podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:44.809006 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:44.808940 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:45.809436 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:45.809389 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:46.808779 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:46.808724 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:47.809270 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:47.809219 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:48.809532 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:48.809468 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" 
output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:49.808915 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:49.808868 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:49.809275 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:49.809237 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:50.809237 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:50.809185 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:51.809668 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:51.809615 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:52.809403 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:52.809351 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.55:9000/health/started\": dial tcp 10.134.0.55:9000: connect: connection refused" Apr 16 13:22:53.904539 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:53.904491 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="keycloak-system/maas-keycloak-0" Apr 16 13:22:53.925089 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:22:53.925032 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 13:23:03.909690 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:03.909635 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="db328bdd-c569-4272-b9c1-67ec54907a5e" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 13:23:13.911218 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:13.911177 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 16 13:23:24.590609 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.590522 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-69dfb7f775-96s5b"] Apr 16 13:23:24.594031 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.594012 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.596889 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.596869 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\""
Apr 16 13:23:24.602752 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.602729 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-69dfb7f775-96s5b"]
Apr 16 13:23:24.757853 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.757812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64jk\" (UniqueName: \"kubernetes.io/projected/2eac0928-88b6-48f1-ade6-2044e25388ea-kube-api-access-s64jk\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.758029 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.757885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2eac0928-88b6-48f1-ade6-2044e25388ea-oidc-ca\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.758029 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.757920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2eac0928-88b6-48f1-ade6-2044e25388ea-tls-cert\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.858654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.858550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2eac0928-88b6-48f1-ade6-2044e25388ea-tls-cert\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.858654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.858618 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s64jk\" (UniqueName: \"kubernetes.io/projected/2eac0928-88b6-48f1-ade6-2044e25388ea-kube-api-access-s64jk\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.858905 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.858694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2eac0928-88b6-48f1-ade6-2044e25388ea-oidc-ca\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.859395 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.859370 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2eac0928-88b6-48f1-ade6-2044e25388ea-oidc-ca\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.861003 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.860981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2eac0928-88b6-48f1-ade6-2044e25388ea-tls-cert\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.867436 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.867416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64jk\" (UniqueName: \"kubernetes.io/projected/2eac0928-88b6-48f1-ade6-2044e25388ea-kube-api-access-s64jk\") pod \"authorino-69dfb7f775-96s5b\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:24.904313 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:24.904274 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-69dfb7f775-96s5b"
Apr 16 13:23:25.058467 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:25.058428 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-69dfb7f775-96s5b"]
Apr 16 13:23:25.059093 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:23:25.059064 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eac0928_88b6_48f1_ade6_2044e25388ea.slice/crio-c3c9d3397b352471bc12408cd0aa32c62803f954cb348978697dc8529d5ebdf5 WatchSource:0}: Error finding container c3c9d3397b352471bc12408cd0aa32c62803f954cb348978697dc8529d5ebdf5: Status 404 returned error can't find the container with id c3c9d3397b352471bc12408cd0aa32c62803f954cb348978697dc8529d5ebdf5
Apr 16 13:23:25.667916 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:25.667876 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-69dfb7f775-96s5b" event={"ID":"2eac0928-88b6-48f1-ade6-2044e25388ea","Type":"ContainerStarted","Data":"9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713"}
Apr 16 13:23:25.668307 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:25.667922 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-69dfb7f775-96s5b" event={"ID":"2eac0928-88b6-48f1-ade6-2044e25388ea","Type":"ContainerStarted","Data":"c3c9d3397b352471bc12408cd0aa32c62803f954cb348978697dc8529d5ebdf5"}
Apr 16 13:23:25.687085 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:25.686990 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-69dfb7f775-96s5b" podStartSLOduration=1.326641261 podStartE2EDuration="1.686976482s" podCreationTimestamp="2026-04-16 13:23:24 +0000 UTC" firstStartedPulling="2026-04-16 13:23:25.060374055 +0000 UTC m=+707.750305849" lastFinishedPulling="2026-04-16 13:23:25.420709277 +0000 UTC m=+708.110641070" observedRunningTime="2026-04-16 13:23:25.684617442 +0000 UTC m=+708.374549259" watchObservedRunningTime="2026-04-16 13:23:25.686976482 +0000 UTC m=+708.376908296"
Apr 16 13:23:25.713049 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:25.713016 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57b4dcbf6f-zncdd"]
Apr 16 13:23:25.713270 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:25.713243 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd" podUID="85ff623e-2399-460d-80ad-8b7f75994436" containerName="authorino" containerID="cri-o://8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1" gracePeriod=30
Apr 16 13:23:26.020907 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.020880 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:23:26.169746 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.169689 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5qn7\" (UniqueName: \"kubernetes.io/projected/85ff623e-2399-460d-80ad-8b7f75994436-kube-api-access-j5qn7\") pod \"85ff623e-2399-460d-80ad-8b7f75994436\" (UID: \"85ff623e-2399-460d-80ad-8b7f75994436\") "
Apr 16 13:23:26.169935 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.169890 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/85ff623e-2399-460d-80ad-8b7f75994436-tls-cert\") pod \"85ff623e-2399-460d-80ad-8b7f75994436\" (UID: \"85ff623e-2399-460d-80ad-8b7f75994436\") "
Apr 16 13:23:26.171935 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.171900 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ff623e-2399-460d-80ad-8b7f75994436-kube-api-access-j5qn7" (OuterVolumeSpecName: "kube-api-access-j5qn7") pod "85ff623e-2399-460d-80ad-8b7f75994436" (UID: "85ff623e-2399-460d-80ad-8b7f75994436"). InnerVolumeSpecName "kube-api-access-j5qn7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 13:23:26.180739 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.180713 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ff623e-2399-460d-80ad-8b7f75994436-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "85ff623e-2399-460d-80ad-8b7f75994436" (UID: "85ff623e-2399-460d-80ad-8b7f75994436"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 13:23:26.271430 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.271332 2574 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/85ff623e-2399-460d-80ad-8b7f75994436-tls-cert\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:23:26.271430 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.271374 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j5qn7\" (UniqueName: \"kubernetes.io/projected/85ff623e-2399-460d-80ad-8b7f75994436-kube-api-access-j5qn7\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\""
Apr 16 13:23:26.672948 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.672913 2574 generic.go:358] "Generic (PLEG): container finished" podID="85ff623e-2399-460d-80ad-8b7f75994436" containerID="8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1" exitCode=0
Apr 16 13:23:26.673344 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.672961 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd"
Apr 16 13:23:26.673344 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.672994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd" event={"ID":"85ff623e-2399-460d-80ad-8b7f75994436","Type":"ContainerDied","Data":"8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1"}
Apr 16 13:23:26.673344 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.673033 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-57b4dcbf6f-zncdd" event={"ID":"85ff623e-2399-460d-80ad-8b7f75994436","Type":"ContainerDied","Data":"5190146b367ad0137c52d8bb6125a9abc56bd80ddbad157c3bddf16ea8a7b028"}
Apr 16 13:23:26.673344 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.673048 2574 scope.go:117] "RemoveContainer" containerID="8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1"
Apr 16 13:23:26.682345 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.682327 2574 scope.go:117] "RemoveContainer" containerID="8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1"
Apr 16 13:23:26.682624 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:23:26.682605 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1\": container with ID starting with 8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1 not found: ID does not exist" containerID="8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1"
Apr 16 13:23:26.682677 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.682634 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1"} err="failed to get container status \"8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1\": rpc error: code = NotFound desc = could not find container \"8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1\": container with ID starting with 8683df0a24fb5df147a79c77e66d7e8abbf5063a7c0372349699e27f11708ec1 not found: ID does not exist"
Apr 16 13:23:26.698061 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.698029 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-57b4dcbf6f-zncdd"]
Apr 16 13:23:26.703604 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:26.703577 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-57b4dcbf6f-zncdd"]
Apr 16 13:23:27.883253 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:27.883225 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ff623e-2399-460d-80ad-8b7f75994436" path="/var/lib/kubelet/pods/85ff623e-2399-460d-80ad-8b7f75994436/volumes"
Apr 16 13:23:28.276902 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.276869 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"]
Apr 16 13:23:28.277247 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.277236 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="85ff623e-2399-460d-80ad-8b7f75994436" containerName="authorino"
Apr 16 13:23:28.277293 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.277249 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ff623e-2399-460d-80ad-8b7f75994436" containerName="authorino"
Apr 16 13:23:28.277330 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.277323 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="85ff623e-2399-460d-80ad-8b7f75994436" containerName="authorino"
Apr 16 13:23:28.282417 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.282400 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.286504 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.286480 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 16 13:23:28.286504 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.286480 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 16 13:23:28.286835 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.286546 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\""
Apr 16 13:23:28.286835 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.286536 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-br65d\""
Apr 16 13:23:28.290240 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.290187 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"]
Apr 16 13:23:28.389857 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.389821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.389857 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.389858 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.390123 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.389891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndjhz\" (UniqueName: \"kubernetes.io/projected/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-kube-api-access-ndjhz\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.390123 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.390029 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.390123 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.390105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.390269 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.390134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491267 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491429 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491429 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491429 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491394 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491429 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491694 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndjhz\" (UniqueName: \"kubernetes.io/projected/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-kube-api-access-ndjhz\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491842 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491815 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491910 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.491910 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.491884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.493789 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.493767 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.493962 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.493942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.499448 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.499420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndjhz\" (UniqueName: \"kubernetes.io/projected/b91ffcd7-1ce0-4a77-9d1c-e9dced46011c-kube-api-access-ndjhz\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-qv84k\" (UID: \"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.594379 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.594271 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:28.723109 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:28.723080 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"]
Apr 16 13:23:28.724148 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:23:28.724111 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb91ffcd7_1ce0_4a77_9d1c_e9dced46011c.slice/crio-5378224fdff497ab218bf370956d816bbce4bf0f36ed38edba101353032924a7 WatchSource:0}: Error finding container 5378224fdff497ab218bf370956d816bbce4bf0f36ed38edba101353032924a7: Status 404 returned error can't find the container with id 5378224fdff497ab218bf370956d816bbce4bf0f36ed38edba101353032924a7
Apr 16 13:23:29.688277 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:29.688241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k" event={"ID":"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c","Type":"ContainerStarted","Data":"5378224fdff497ab218bf370956d816bbce4bf0f36ed38edba101353032924a7"}
Apr 16 13:23:35.717011 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:35.716970 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k" event={"ID":"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c","Type":"ContainerStarted","Data":"5ffe2193044f2798994ca8d29ef4194e8cd93b256b4257c1ccc87efbeec0399e"}
Apr 16 13:23:38.702232 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.702197 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"]
Apr 16 13:23:38.707132 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.707113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.710196 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.710165 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 16 13:23:38.715426 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.715398 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"]
Apr 16 13:23:38.781164 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.781130 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.781164 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.781170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr9v7\" (UniqueName: \"kubernetes.io/projected/1dea6490-fe12-4060-9f3d-c8dc13995739-kube-api-access-nr9v7\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.781381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.781248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.781381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.781290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1dea6490-fe12-4060-9f3d-c8dc13995739-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.781381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.781321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.781381 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.781344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882078 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882078 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1dea6490-fe12-4060-9f3d-c8dc13995739-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882322 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882322 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882322 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882322 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr9v7\" (UniqueName: \"kubernetes.io/projected/1dea6490-fe12-4060-9f3d-c8dc13995739-kube-api-access-nr9v7\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882561 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882620 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882557 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.882620 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.882592 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.884426 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.884396 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1dea6490-fe12-4060-9f3d-c8dc13995739-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.884882 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.884861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1dea6490-fe12-4060-9f3d-c8dc13995739-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:38.889888 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:38.889860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr9v7\" (UniqueName: \"kubernetes.io/projected/1dea6490-fe12-4060-9f3d-c8dc13995739-kube-api-access-nr9v7\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc\" (UID: \"1dea6490-fe12-4060-9f3d-c8dc13995739\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:39.019956 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:39.019860 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:39.168532 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:39.168221 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"]
Apr 16 13:23:39.735419 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:39.735377 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc" event={"ID":"1dea6490-fe12-4060-9f3d-c8dc13995739","Type":"ContainerStarted","Data":"40c48bd68641bcd24f362a074e9b2bff4ab9616ab1bc0ded1a64e408395d2a26"}
Apr 16 13:23:39.735419 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:39.735425 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc" event={"ID":"1dea6490-fe12-4060-9f3d-c8dc13995739","Type":"ContainerStarted","Data":"fa47d03dc8c07dba0dae1ede3782d17feeb6e6f9109c1812b7aee25cface5394"}
Apr 16 13:23:40.740426 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:40.740344 2574 generic.go:358] "Generic (PLEG): container finished" podID="b91ffcd7-1ce0-4a77-9d1c-e9dced46011c" containerID="5ffe2193044f2798994ca8d29ef4194e8cd93b256b4257c1ccc87efbeec0399e" exitCode=0
Apr 16 13:23:40.740780 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:40.740422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k" event={"ID":"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c","Type":"ContainerDied","Data":"5ffe2193044f2798994ca8d29ef4194e8cd93b256b4257c1ccc87efbeec0399e"}
Apr 16 13:23:44.762159 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:44.762120 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k" event={"ID":"b91ffcd7-1ce0-4a77-9d1c-e9dced46011c","Type":"ContainerStarted","Data":"200ef786255d2bbf93a1bc0dbd00974a00a766e87e42f83fd2be8f1f67fbdd6d"}
Apr 16 13:23:44.762574 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:44.762345 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:44.785776 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:44.785656 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k" podStartSLOduration=0.981220873 podStartE2EDuration="16.78563863s" podCreationTimestamp="2026-04-16 13:23:28 +0000 UTC" firstStartedPulling="2026-04-16 13:23:28.726085851 +0000 UTC m=+711.416017650" lastFinishedPulling="2026-04-16 13:23:44.530503613 +0000 UTC m=+727.220435407" observedRunningTime="2026-04-16 13:23:44.782912356 +0000 UTC m=+727.472844172" watchObservedRunningTime="2026-04-16 13:23:44.78563863 +0000 UTC m=+727.475570445"
Apr 16 13:23:45.768272 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:45.768187 2574 generic.go:358] "Generic (PLEG): container finished" podID="1dea6490-fe12-4060-9f3d-c8dc13995739" containerID="40c48bd68641bcd24f362a074e9b2bff4ab9616ab1bc0ded1a64e408395d2a26" exitCode=0
Apr 16 13:23:45.768272 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:45.768260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc" event={"ID":"1dea6490-fe12-4060-9f3d-c8dc13995739","Type":"ContainerDied","Data":"40c48bd68641bcd24f362a074e9b2bff4ab9616ab1bc0ded1a64e408395d2a26"}
Apr 16 13:23:46.773863 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:46.773829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc" event={"ID":"1dea6490-fe12-4060-9f3d-c8dc13995739","Type":"ContainerStarted","Data":"18ebec64c21daaa2f4d1f572a37562a7d1675ea741ba1fe715257d72e796fb45"}
Apr 16 13:23:46.774248 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:46.774039 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16 13:23:46.796504 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:46.796421 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc" podStartSLOduration=8.480823538 podStartE2EDuration="8.79640603s" podCreationTimestamp="2026-04-16 13:23:38 +0000 UTC" firstStartedPulling="2026-04-16 13:23:45.769022946 +0000 UTC m=+728.458954739" lastFinishedPulling="2026-04-16 13:23:46.084605432 +0000 UTC m=+728.774537231" observedRunningTime="2026-04-16 13:23:46.795041508 +0000 UTC m=+729.484973333" watchObservedRunningTime="2026-04-16 13:23:46.79640603 +0000 UTC m=+729.486337895"
Apr 16 13:23:55.781819 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:55.781788 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-qv84k"
Apr 16 13:23:57.791338 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:23:57.791308 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc"
Apr 16
13:24:59.363358 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.363326 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5f7fc7cc9d-nl827"] Apr 16 13:24:59.367067 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.367047 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.373671 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.373644 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5f7fc7cc9d-nl827"] Apr 16 13:24:59.521571 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.521534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1378e342-70ff-4ace-bcc1-e6833128a3d5-tls-cert\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.521753 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.521608 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1378e342-70ff-4ace-bcc1-e6833128a3d5-oidc-ca\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.521753 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.521659 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzhww\" (UniqueName: \"kubernetes.io/projected/1378e342-70ff-4ace-bcc1-e6833128a3d5-kube-api-access-gzhww\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.622947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.622854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1378e342-70ff-4ace-bcc1-e6833128a3d5-tls-cert\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.622947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.622927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1378e342-70ff-4ace-bcc1-e6833128a3d5-oidc-ca\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.623147 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.622957 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzhww\" (UniqueName: \"kubernetes.io/projected/1378e342-70ff-4ace-bcc1-e6833128a3d5-kube-api-access-gzhww\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.623647 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.623629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/1378e342-70ff-4ace-bcc1-e6833128a3d5-oidc-ca\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.625232 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.625213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1378e342-70ff-4ace-bcc1-e6833128a3d5-tls-cert\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.630563 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.630539 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gzhww\" (UniqueName: \"kubernetes.io/projected/1378e342-70ff-4ace-bcc1-e6833128a3d5-kube-api-access-gzhww\") pod \"authorino-5f7fc7cc9d-nl827\" (UID: \"1378e342-70ff-4ace-bcc1-e6833128a3d5\") " pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.677566 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.677526 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" Apr 16 13:24:59.803647 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:24:59.803615 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5f7fc7cc9d-nl827"] Apr 16 13:24:59.804895 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:24:59.804871 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1378e342_70ff_4ace_bcc1_e6833128a3d5.slice/crio-b538ce25ea82d4c0b9b49efa253c8264121b28269168f427d0a669d2eca8ce32 WatchSource:0}: Error finding container b538ce25ea82d4c0b9b49efa253c8264121b28269168f427d0a669d2eca8ce32: Status 404 returned error can't find the container with id b538ce25ea82d4c0b9b49efa253c8264121b28269168f427d0a669d2eca8ce32 Apr 16 13:25:00.073343 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:00.073298 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" event={"ID":"1378e342-70ff-4ace-bcc1-e6833128a3d5","Type":"ContainerStarted","Data":"b538ce25ea82d4c0b9b49efa253c8264121b28269168f427d0a669d2eca8ce32"} Apr 16 13:25:01.082727 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.082683 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" event={"ID":"1378e342-70ff-4ace-bcc1-e6833128a3d5","Type":"ContainerStarted","Data":"ab222b2dd0249b0afe28f574b31c4bac9f45b908995ec8a5bbe30263603ef985"} Apr 16 13:25:01.100056 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.100000 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5f7fc7cc9d-nl827" podStartSLOduration=1.71241389 podStartE2EDuration="2.099984927s" podCreationTimestamp="2026-04-16 13:24:59 +0000 UTC" firstStartedPulling="2026-04-16 13:24:59.806211287 +0000 UTC m=+802.496143080" lastFinishedPulling="2026-04-16 13:25:00.193782323 +0000 UTC m=+802.883714117" observedRunningTime="2026-04-16 13:25:01.097590146 +0000 UTC m=+803.787521961" watchObservedRunningTime="2026-04-16 13:25:01.099984927 +0000 UTC m=+803.789916742" Apr 16 13:25:01.124882 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.124849 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-69dfb7f775-96s5b"] Apr 16 13:25:01.125129 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.125104 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-69dfb7f775-96s5b" podUID="2eac0928-88b6-48f1-ade6-2044e25388ea" containerName="authorino" containerID="cri-o://9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713" gracePeriod=30 Apr 16 13:25:01.369382 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.369358 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-69dfb7f775-96s5b" Apr 16 13:25:01.542781 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.542730 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s64jk\" (UniqueName: \"kubernetes.io/projected/2eac0928-88b6-48f1-ade6-2044e25388ea-kube-api-access-s64jk\") pod \"2eac0928-88b6-48f1-ade6-2044e25388ea\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " Apr 16 13:25:01.542971 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.542828 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2eac0928-88b6-48f1-ade6-2044e25388ea-oidc-ca\") pod \"2eac0928-88b6-48f1-ade6-2044e25388ea\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " Apr 16 13:25:01.542971 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.542872 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2eac0928-88b6-48f1-ade6-2044e25388ea-tls-cert\") pod \"2eac0928-88b6-48f1-ade6-2044e25388ea\" (UID: \"2eac0928-88b6-48f1-ade6-2044e25388ea\") " Apr 16 13:25:01.544946 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.544916 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eac0928-88b6-48f1-ade6-2044e25388ea-kube-api-access-s64jk" (OuterVolumeSpecName: "kube-api-access-s64jk") pod "2eac0928-88b6-48f1-ade6-2044e25388ea" (UID: "2eac0928-88b6-48f1-ade6-2044e25388ea"). InnerVolumeSpecName "kube-api-access-s64jk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:25:01.547986 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.547961 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eac0928-88b6-48f1-ade6-2044e25388ea-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "2eac0928-88b6-48f1-ade6-2044e25388ea" (UID: "2eac0928-88b6-48f1-ade6-2044e25388ea"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:25:01.555069 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.555041 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eac0928-88b6-48f1-ade6-2044e25388ea-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "2eac0928-88b6-48f1-ade6-2044e25388ea" (UID: "2eac0928-88b6-48f1-ade6-2044e25388ea"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:25:01.643550 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.643500 2574 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2eac0928-88b6-48f1-ade6-2044e25388ea-tls-cert\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:25:01.643550 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.643549 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s64jk\" (UniqueName: \"kubernetes.io/projected/2eac0928-88b6-48f1-ade6-2044e25388ea-kube-api-access-s64jk\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:25:01.643741 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:01.643564 2574 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2eac0928-88b6-48f1-ade6-2044e25388ea-oidc-ca\") on node \"ip-10-0-137-242.ec2.internal\" DevicePath \"\"" Apr 16 13:25:02.088299 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.088264 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="2eac0928-88b6-48f1-ade6-2044e25388ea" containerID="9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713" exitCode=0 Apr 16 13:25:02.088804 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.088318 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-69dfb7f775-96s5b" Apr 16 13:25:02.088804 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.088312 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-69dfb7f775-96s5b" event={"ID":"2eac0928-88b6-48f1-ade6-2044e25388ea","Type":"ContainerDied","Data":"9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713"} Apr 16 13:25:02.088804 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.088424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-69dfb7f775-96s5b" event={"ID":"2eac0928-88b6-48f1-ade6-2044e25388ea","Type":"ContainerDied","Data":"c3c9d3397b352471bc12408cd0aa32c62803f954cb348978697dc8529d5ebdf5"} Apr 16 13:25:02.088804 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.088444 2574 scope.go:117] "RemoveContainer" containerID="9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713" Apr 16 13:25:02.098750 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.098729 2574 scope.go:117] "RemoveContainer" containerID="9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713" Apr 16 13:25:02.099077 ip-10-0-137-242 kubenswrapper[2574]: E0416 13:25:02.099053 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713\": container with ID starting with 9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713 not found: ID does not exist" containerID="9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713" Apr 16 13:25:02.099127 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.099087 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713"} err="failed to get container status \"9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713\": rpc error: code = NotFound desc = could not find container \"9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713\": container with ID starting with 9d3d3434291cfd5a9c5be55f5cd68e23e7a310dff3cd12cbbe74b27c97960713 not found: ID does not exist" Apr 16 13:25:02.108697 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.108659 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-69dfb7f775-96s5b"] Apr 16 13:25:02.112055 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:02.112024 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-69dfb7f775-96s5b"] Apr 16 13:25:03.883176 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:25:03.883135 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eac0928-88b6-48f1-ade6-2044e25388ea" path="/var/lib/kubelet/pods/2eac0928-88b6-48f1-ade6-2044e25388ea/volumes" Apr 16 13:26:37.839251 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:26:37.839213 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:26:37.843204 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:26:37.843178 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:26:37.844806 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:26:37.844782 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:26:37.848436 ip-10-0-137-242 
kubenswrapper[2574]: I0416 13:26:37.848416 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:31:37.875494 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:31:37.875447 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:31:37.879363 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:31:37.879336 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:31:37.881283 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:31:37.881259 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:31:37.885654 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:31:37.885635 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:36:37.913307 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:36:37.913275 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:36:37.917172 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:36:37.917152 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:36:37.919561 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:36:37.919540 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:36:37.923228 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:36:37.923212 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:41:37.949239 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:41:37.949211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:41:37.952962 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:41:37.952938 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:41:37.957755 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:41:37.957737 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:41:37.961344 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:41:37.961325 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:43:47.184642 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:43:47.184608 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5f7fc7cc9d-nl827_1378e342-70ff-4ace-bcc1-e6833128a3d5/authorino/0.log" Apr 16 13:44:13.202800 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:13.202764 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-579b4f9c55-97k59_481ab357-361c-40b4-9b1a-7666d917ce71/manager/0.log" Apr 16 13:44:13.675367 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:13.675342 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5889847794-k6jmm_1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2/manager/0.log" Apr 16 13:44:15.744947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:15.744920 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj_e95696cd-e809-4a69-9157-868b6467640a/util/0.log" Apr 16 13:44:15.753653 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:15.753629 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj_e95696cd-e809-4a69-9157-868b6467640a/pull/0.log" Apr 16 13:44:15.761467 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:15.761428 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj_e95696cd-e809-4a69-9157-868b6467640a/extract/0.log" Apr 16 13:44:16.055361 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.055283 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4_aaca4958-bbc5-4859-8e1a-aff64b58930f/extract/0.log" Apr 16 13:44:16.062614 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.062592 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4_aaca4958-bbc5-4859-8e1a-aff64b58930f/util/0.log" Apr 16 13:44:16.068697 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.068670 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4_aaca4958-bbc5-4859-8e1a-aff64b58930f/pull/0.log" Apr 16 13:44:16.234295 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.234262 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs_2dc9b953-175a-4ee0-b859-e7b0e9d03c19/util/0.log" Apr 16 13:44:16.240186 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.240163 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs_2dc9b953-175a-4ee0-b859-e7b0e9d03c19/pull/0.log" Apr 16 13:44:16.248550 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.248528 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs_2dc9b953-175a-4ee0-b859-e7b0e9d03c19/extract/0.log" Apr 16 13:44:16.381502 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.381400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9_29e9d28a-1ec6-47ae-ad10-b068e096acd7/pull/0.log" Apr 16 13:44:16.405212 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.405184 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9_29e9d28a-1ec6-47ae-ad10-b068e096acd7/extract/0.log" Apr 16 13:44:16.417998 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.417978 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9_29e9d28a-1ec6-47ae-ad10-b068e096acd7/util/0.log" Apr 16 13:44:16.574335 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.574304 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5f7fc7cc9d-nl827_1378e342-70ff-4ace-bcc1-e6833128a3d5/authorino/0.log" Apr 16 13:44:16.987879 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:16.987851 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-988vx_27f800cd-853c-4b2b-af56-1a8597700a17/kuadrant-console-plugin/0.log" Apr 16 13:44:18.320479 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:18.320371 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7485ccd7bf-rwxp7_b81c2ff0-12e1-4362-88f7-83c3c2bd95d7/kube-auth-proxy/0.log" Apr 16 13:44:19.140684 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:19.140655 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-qv84k_b91ffcd7-1ce0-4a77-9d1c-e9dced46011c/storage-initializer/0.log" Apr 16 13:44:19.146970 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:19.146950 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-qv84k_b91ffcd7-1ce0-4a77-9d1c-e9dced46011c/main/0.log" Apr 16 13:44:19.421659 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:19.421633 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc_1dea6490-fe12-4060-9f3d-c8dc13995739/storage-initializer/0.log" Apr 16 13:44:19.427816 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:19.427796 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-85fbc_1dea6490-fe12-4060-9f3d-c8dc13995739/main/0.log" Apr 16 13:44:26.490010 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:26.489976 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-d8x8s_a67fb660-e171-411d-96d9-bf4f1692c5d7/global-pull-secret-syncer/0.log" Apr 16 13:44:26.597025 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:26.596997 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-5nl2m_a278bcef-bcae-4a88-80bc-8c4850ab20b1/konnectivity-agent/0.log" Apr 16 13:44:26.670729 ip-10-0-137-242 kubenswrapper[2574]: I0416 
13:44:26.670704 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-242.ec2.internal_5a8bce9435bf9de24fab90842b9d921a/haproxy/0.log" Apr 16 13:44:30.466039 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.466008 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj_e95696cd-e809-4a69-9157-868b6467640a/extract/0.log" Apr 16 13:44:30.514945 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.514914 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj_e95696cd-e809-4a69-9157-868b6467640a/util/0.log" Apr 16 13:44:30.542188 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.542157 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7595bvtj_e95696cd-e809-4a69-9157-868b6467640a/pull/0.log" Apr 16 13:44:30.572246 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.572221 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4_aaca4958-bbc5-4859-8e1a-aff64b58930f/extract/0.log" Apr 16 13:44:30.594506 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.594477 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4_aaca4958-bbc5-4859-8e1a-aff64b58930f/util/0.log" Apr 16 13:44:30.615123 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.615092 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0p4cq4_aaca4958-bbc5-4859-8e1a-aff64b58930f/pull/0.log" Apr 16 13:44:30.638955 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.638929 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs_2dc9b953-175a-4ee0-b859-e7b0e9d03c19/extract/0.log" Apr 16 13:44:30.660132 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.660104 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs_2dc9b953-175a-4ee0-b859-e7b0e9d03c19/util/0.log" Apr 16 13:44:30.689292 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.689264 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73gg8rs_2dc9b953-175a-4ee0-b859-e7b0e9d03c19/pull/0.log" Apr 16 13:44:30.716488 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.716388 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9_29e9d28a-1ec6-47ae-ad10-b068e096acd7/extract/0.log" Apr 16 13:44:30.735241 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.735211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9_29e9d28a-1ec6-47ae-ad10-b068e096acd7/util/0.log" Apr 16 13:44:30.758177 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.758154 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef16q7r9_29e9d28a-1ec6-47ae-ad10-b068e096acd7/pull/0.log" Apr 16 13:44:30.940399 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:30.940368 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-5f7fc7cc9d-nl827_1378e342-70ff-4ace-bcc1-e6833128a3d5/authorino/0.log" Apr 16 13:44:31.031100 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:31.031020 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-988vx_27f800cd-853c-4b2b-af56-1a8597700a17/kuadrant-console-plugin/0.log" Apr 16 13:44:32.724928 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:32.724897 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_40fdff14-28a4-49a1-bd45-05f59bae4cbf/alertmanager/0.log" Apr 16 13:44:32.744231 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:32.744187 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_40fdff14-28a4-49a1-bd45-05f59bae4cbf/config-reloader/0.log" Apr 16 13:44:32.768829 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:32.768810 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_40fdff14-28a4-49a1-bd45-05f59bae4cbf/kube-rbac-proxy-web/0.log" Apr 16 13:44:32.793322 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:32.793296 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_40fdff14-28a4-49a1-bd45-05f59bae4cbf/kube-rbac-proxy/0.log" Apr 16 13:44:32.816891 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:32.816866 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_40fdff14-28a4-49a1-bd45-05f59bae4cbf/kube-rbac-proxy-metric/0.log" Apr 16 13:44:32.843153 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:32.843106 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_40fdff14-28a4-49a1-bd45-05f59bae4cbf/prom-label-proxy/0.log" Apr 16 13:44:32.866491 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:32.866463 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_40fdff14-28a4-49a1-bd45-05f59bae4cbf/init-config-reloader/0.log" Apr 16 13:44:33.042859 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.042780 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-lnb6w_4db29eaf-1248-4f16-9c89-f07661408149/monitoring-plugin/0.log" Apr 16 13:44:33.208838 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.208806 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rqxwg_abf89fc6-7379-4a8b-b65c-7d612bfa9e16/node-exporter/0.log" Apr 16 13:44:33.226437 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.226413 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rqxwg_abf89fc6-7379-4a8b-b65c-7d612bfa9e16/kube-rbac-proxy/0.log" Apr 16 13:44:33.243554 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.243529 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-rqxwg_abf89fc6-7379-4a8b-b65c-7d612bfa9e16/init-textfile/0.log" Apr 16 13:44:33.281035 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.281007 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-8dckj_5feeb15e-917e-4399-bb74-32d60a57e6a9/kube-rbac-proxy-main/0.log" Apr 16 13:44:33.308630 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.308543 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-8dckj_5feeb15e-917e-4399-bb74-32d60a57e6a9/kube-rbac-proxy-self/0.log" Apr 16 13:44:33.329775 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.329748 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-8dckj_5feeb15e-917e-4399-bb74-32d60a57e6a9/openshift-state-metrics/0.log" Apr 16 13:44:33.590300 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.590221 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-9mprx_8be2c403-d308-4c1d-83f1-374a4c40788d/prometheus-operator-admission-webhook/0.log" Apr 16 13:44:33.693811 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.693783 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f878f577b-4q66m_cfb03555-ea15-4b8c-a044-56687ec8276a/thanos-query/0.log" Apr 16 13:44:33.712128 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.712096 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f878f577b-4q66m_cfb03555-ea15-4b8c-a044-56687ec8276a/kube-rbac-proxy-web/0.log" Apr 16 13:44:33.729358 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.729333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f878f577b-4q66m_cfb03555-ea15-4b8c-a044-56687ec8276a/kube-rbac-proxy/0.log" Apr 16 13:44:33.749377 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.749355 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f878f577b-4q66m_cfb03555-ea15-4b8c-a044-56687ec8276a/prom-label-proxy/0.log" Apr 16 13:44:33.779377 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.779351 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f878f577b-4q66m_cfb03555-ea15-4b8c-a044-56687ec8276a/kube-rbac-proxy-rules/0.log" Apr 16 13:44:33.797219 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:33.797194 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7f878f577b-4q66m_cfb03555-ea15-4b8c-a044-56687ec8276a/kube-rbac-proxy-metrics/0.log" Apr 16 13:44:34.890469 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:34.890406 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-szf7m_19d2435c-5afe-472e-90fc-b47d7e20c7ca/networking-console-plugin/0.log" Apr 16 13:44:35.282262 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.282228 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h"] Apr 16 13:44:35.282661 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.282648 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2eac0928-88b6-48f1-ade6-2044e25388ea" containerName="authorino" Apr 16 13:44:35.282713 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.282663 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eac0928-88b6-48f1-ade6-2044e25388ea" containerName="authorino" Apr 16 13:44:35.282749 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.282729 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2eac0928-88b6-48f1-ade6-2044e25388ea" containerName="authorino" Apr 16 13:44:35.285947 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.285929 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.288725 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.288700 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gbjl6\"/\"default-dockercfg-tfgq9\"" Apr 16 13:44:35.288874 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.288703 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbjl6\"/\"openshift-service-ca.crt\"" Apr 16 13:44:35.290227 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.290210 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbjl6\"/\"kube-root-ca.crt\"" Apr 16 13:44:35.295389 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.295367 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h"] Apr 16 13:44:35.325065 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.325028 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-proc\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.325270 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.325096 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-lib-modules\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.325270 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.325129 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-sys\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.325270 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.325157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-podres\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.325270 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.325211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x9l2\" (UniqueName: \"kubernetes.io/projected/96eb0aae-0cde-4dd0-ac30-111f2daa387d-kube-api-access-6x9l2\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.425868 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.425840 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-lib-modules\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.426041 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.425876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-sys\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " 
pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.426041 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.425893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-podres\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.426041 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.425919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6x9l2\" (UniqueName: \"kubernetes.io/projected/96eb0aae-0cde-4dd0-ac30-111f2daa387d-kube-api-access-6x9l2\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.426041 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.425975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-proc\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.426041 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.425987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-sys\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.426041 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.426020 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-podres\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.426041 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.426022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-lib-modules\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.426041 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.426041 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/96eb0aae-0cde-4dd0-ac30-111f2daa387d-proc\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.432821 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.432798 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/2.log" Apr 16 13:44:35.434487 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.434436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x9l2\" (UniqueName: \"kubernetes.io/projected/96eb0aae-0cde-4dd0-ac30-111f2daa387d-kube-api-access-6x9l2\") pod \"perf-node-gather-daemonset-mvm8h\" (UID: \"96eb0aae-0cde-4dd0-ac30-111f2daa387d\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.437074 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.437053 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-ppgg7_53d7350f-ab9a-4dba-a9fe-1a1980008f10/console-operator/3.log" Apr 16 13:44:35.597873 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.597772 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.725784 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.725761 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h"] Apr 16 13:44:35.728359 ip-10-0-137-242 kubenswrapper[2574]: W0416 13:44:35.728329 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod96eb0aae_0cde_4dd0_ac30_111f2daa387d.slice/crio-ed18cc40aacb0f41c960bc0d361a79503fddf1f85020a568c9397c5f99ba00d8 WatchSource:0}: Error finding container ed18cc40aacb0f41c960bc0d361a79503fddf1f85020a568c9397c5f99ba00d8: Status 404 returned error can't find the container with id ed18cc40aacb0f41c960bc0d361a79503fddf1f85020a568c9397c5f99ba00d8 Apr 16 13:44:35.730281 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.730265 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:44:35.934409 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.934386 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9c44f7d9c-2hwkw_6e1553e0-6e76-4f1d-bf40-5bdeb970c118/console/0.log" Apr 16 13:44:35.941878 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.941844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" event={"ID":"96eb0aae-0cde-4dd0-ac30-111f2daa387d","Type":"ContainerStarted","Data":"d61815ac8ff7a3fec3e7f39ae45870f7341dc7fe10027b24853659c686d29040"} Apr 16 13:44:35.942038 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.941889 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:35.942038 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.941904 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" event={"ID":"96eb0aae-0cde-4dd0-ac30-111f2daa387d","Type":"ContainerStarted","Data":"ed18cc40aacb0f41c960bc0d361a79503fddf1f85020a568c9397c5f99ba00d8"} Apr 16 13:44:35.959054 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.959010 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" podStartSLOduration=0.958996858 podStartE2EDuration="958.996858ms" podCreationTimestamp="2026-04-16 13:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:44:35.957064503 +0000 UTC m=+1978.646996319" watchObservedRunningTime="2026-04-16 13:44:35.958996858 +0000 UTC m=+1978.648928673" Apr 16 13:44:35.964052 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:35.964034 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-t7qss_ace8c1d3-a5ca-44e1-aeb7-2c82fe2b253e/download-server/0.log" Apr 16 13:44:36.504328 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:36.504299 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-n6nwm_b5ad3b81-d3af-41e5-9cd8-ee5f402461da/volume-data-source-validator/0.log" Apr 16 13:44:37.311100 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:37.311055 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5ghzg_31b5bb3a-e40c-4101-9afe-159f4de5432f/dns/0.log" Apr 16 13:44:37.329323 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:37.329294 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-5ghzg_31b5bb3a-e40c-4101-9afe-159f4de5432f/kube-rbac-proxy/0.log" Apr 16 13:44:37.446116 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:37.446091 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-rcb6c_2f6415fd-c655-4953-99ef-9575b096b3e2/dns-node-resolver/0.log" Apr 16 13:44:37.983757 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:37.983728 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-j7cl9_89d8b508-8fec-42a0-b382-7b625093ae71/node-ca/0.log" Apr 16 13:44:38.945422 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:38.945395 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7485ccd7bf-rwxp7_b81c2ff0-12e1-4362-88f7-83c3c2bd95d7/kube-auth-proxy/0.log" Apr 16 13:44:39.569549 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:39.569520 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vgz9p_de85bd7a-e5fd-4893-9f0b-d98d2bc4c225/serve-healthcheck-canary/0.log" Apr 16 13:44:40.093833 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:40.093807 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4s7w4_a2e1eeca-0df2-4cf5-b682-837d6e38c6fe/kube-rbac-proxy/0.log" Apr 16 13:44:40.110572 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:40.110545 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4s7w4_a2e1eeca-0df2-4cf5-b682-837d6e38c6fe/exporter/0.log" Apr 16 13:44:40.128601 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:40.128570 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4s7w4_a2e1eeca-0df2-4cf5-b682-837d6e38c6fe/extractor/0.log" Apr 16 13:44:41.956109 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:41.956082 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-mvm8h" Apr 16 13:44:42.171951 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:42.171919 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-579b4f9c55-97k59_481ab357-361c-40b4-9b1a-7666d917ce71/manager/0.log" Apr 16 13:44:42.261592 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:42.261513 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5889847794-k6jmm_1c3f9ca4-b8e1-4737-a5e5-e3b1115f17f2/manager/0.log" Apr 16 13:44:43.529197 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:43.529172 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-64f4647cd-6vs9l_1d3dd9f4-e5b4-4733-af71-8c614aa6cb16/manager/0.log" Apr 16 13:44:43.573590 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:43.573561 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-zqq82_189eea4b-3b7e-4a55-a80e-4b5b36a2bdac/openshift-lws-operator/0.log" Apr 16 13:44:48.027993 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:48.027957 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-8hgp4_87dd2329-65a9-4de8-991f-e8d66652bcff/migrator/0.log" Apr 16 13:44:48.048470 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:48.048427 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-8hgp4_87dd2329-65a9-4de8-991f-e8d66652bcff/graceful-termination/0.log" Apr 16 13:44:48.398470 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:48.398373 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-nth8m_6ea21a67-8607-4bf1-9fdc-dcdefcaf7121/kube-storage-version-migrator-operator/1.log" Apr 16 13:44:48.399171 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:48.399150 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-nth8m_6ea21a67-8607-4bf1-9fdc-dcdefcaf7121/kube-storage-version-migrator-operator/0.log" Apr 16 13:44:49.556810 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.556782 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcmf6_de1ece58-c2c4-43a2-9674-0ba69c468ab3/kube-multus-additional-cni-plugins/0.log" Apr 16 13:44:49.574816 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.574792 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcmf6_de1ece58-c2c4-43a2-9674-0ba69c468ab3/egress-router-binary-copy/0.log" Apr 16 13:44:49.592816 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.592790 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcmf6_de1ece58-c2c4-43a2-9674-0ba69c468ab3/cni-plugins/0.log" Apr 16 13:44:49.610298 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.610274 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcmf6_de1ece58-c2c4-43a2-9674-0ba69c468ab3/bond-cni-plugin/0.log" Apr 16 13:44:49.628337 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.628308 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcmf6_de1ece58-c2c4-43a2-9674-0ba69c468ab3/routeoverride-cni/0.log" Apr 16 13:44:49.647196 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.647176 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcmf6_de1ece58-c2c4-43a2-9674-0ba69c468ab3/whereabouts-cni-bincopy/0.log" Apr 16 13:44:49.664898 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.664871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lcmf6_de1ece58-c2c4-43a2-9674-0ba69c468ab3/whereabouts-cni/0.log" Apr 16 13:44:49.827815 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.827738 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h97bc_0f795d25-9d1f-4d82-807d-5a1c84b53f53/kube-multus/0.log" Apr 16 13:44:49.983514 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.983483 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-shf7m_aca4cb2c-c97a-48d4-a12d-3316ad904fc3/network-metrics-daemon/0.log" Apr 16 13:44:49.999324 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:49.999300 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-shf7m_aca4cb2c-c97a-48d4-a12d-3316ad904fc3/kube-rbac-proxy/0.log" Apr 16 13:44:51.420506 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.420474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-controller/0.log" Apr 16 13:44:51.438869 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.438838 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/0.log" Apr 16 13:44:51.450648 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.450562 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovn-acl-logging/1.log" Apr 16 13:44:51.470841 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.470816 2574 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/kube-rbac-proxy-node/0.log" Apr 16 13:44:51.488251 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.488227 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 13:44:51.503318 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.503292 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/northd/0.log" Apr 16 13:44:51.520727 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.520694 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/nbdb/0.log" Apr 16 13:44:51.538928 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.538904 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/sbdb/0.log" Apr 16 13:44:51.634450 ip-10-0-137-242 kubenswrapper[2574]: I0416 13:44:51.634417 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xp25r_abe3d854-132c-4011-9672-f19aca1e1f71/ovnkube-controller/0.log"