Apr 16 23:26:32.262808 ip-10-0-136-153 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 23:26:32.781355 ip-10-0-136-153 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:26:32.781885 ip-10-0-136-153 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 23:26:32.781885 ip-10-0-136-153 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:26:32.781885 ip-10-0-136-153 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 23:26:32.781885 ip-10-0-136-153 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 23:26:32.783621 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.783524 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 23:26:32.785902 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785881 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:26:32.785902 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785897 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:26:32.785902 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785902 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:26:32.785902 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785906 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785911 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785915 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785919 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785925 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785928 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785932 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785936 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785940 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785944 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785948 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785952 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785956 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785960 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785964 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785970 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785976 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785980 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785984 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:26:32.786139 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.785991 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786008 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786013 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786017 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786022 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786026 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786030 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786035 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786039 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786044 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786048 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786052 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786056 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786061 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786074 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786082 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786087 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786091 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786096 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:26:32.786955 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786100 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786106 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786110 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786115 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786119 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786123 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786127 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786131 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786135 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786139 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786143 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786147 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786151 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786155 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786159 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786164 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786169 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786173 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786177 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786182 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:26:32.787718 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786186 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786190 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786194 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786198 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786202 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786206 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786211 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786215 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786223 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786227 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786232 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786236 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786240 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786245 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786250 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786254 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786258 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786262 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786266 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786271 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:26:32.788217 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786275 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786279 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786283 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786287 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786292 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786922 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786932 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786936 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786941 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786945 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786950 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786954 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786958 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786962 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786966 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786970 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786974 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786978 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786982 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786987 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:26:32.788769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786992 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.786997 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787002 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787006 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787010 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787015 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787020 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787025 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787029 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787033 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787037 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787041 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787045 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787049 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787053 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787057 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787061 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787066 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787070 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:26:32.789627 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787074 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787081 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787087 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787092 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787096 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787100 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787104 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787108 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787112 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787117 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787121 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787126 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787131 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787135 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787140 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787146 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787151 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787155 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787159 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787163 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:26:32.790382 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787168 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787172 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787176 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787180 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787185 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787189 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787193 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787197 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787202 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787206 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787211 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787215 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787219 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787223 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787227 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787231 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787236 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787240 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787244 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787248 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:26:32.790970 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787252 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787257 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787261 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787265 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787270 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787274 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787278 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787282 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787287 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787291 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787299 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.787304 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788787 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788807 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788819 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788826 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788833 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788839 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788847 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788854 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788860 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 23:26:32.791564 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788865 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788870 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788875 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788880 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788885 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788890 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788895 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788899 2573 flags.go:64] FLAG: --cloud-config=""
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788904 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788909 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788915 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788919 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788925 2573 flags.go:64] FLAG: --config-dir=""
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788929 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788934 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788940 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788945 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788950 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788955 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788960 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788964 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788970 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788975 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788979 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788986 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 23:26:32.792303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788992 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.788997 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789002 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789007 2573 flags.go:64] FLAG: --enable-server="true"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789012 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789019 2573 flags.go:64] FLAG: --event-burst="100"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789024 2573 flags.go:64] FLAG: --event-qps="50"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789029 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789035 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789039 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789053 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789058 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789063 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789068 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789072 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789077 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789082 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789087 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789092 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789096 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789101 2573 flags.go:64] FLAG: --feature-gates=""
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789107 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789112 2573
flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789117 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789122 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789127 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 16 23:26:32.792944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789132 2573 flags.go:64] FLAG: --help="false" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789137 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-136-153.ec2.internal" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789142 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789147 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789152 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789157 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789163 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789169 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789174 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789179 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 23:26:32.793684 
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789184 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789189 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789195 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789200 2573 flags.go:64] FLAG: --kube-reserved="" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789205 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789210 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789215 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789220 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789224 2573 flags.go:64] FLAG: --lock-file="" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789229 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789234 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789239 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789248 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 23:26:32.793684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789253 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789258 2573 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789262 2573 flags.go:64] FLAG: --logging-format="text" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789267 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789272 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789277 2573 flags.go:64] FLAG: --manifest-url="" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789282 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789289 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789294 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789300 2573 flags.go:64] FLAG: --max-pods="110" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789305 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789310 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789315 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789320 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789325 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789330 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 23:26:32.794244 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:26:32.789335 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789346 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789351 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789356 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789361 2573 flags.go:64] FLAG: --pod-cidr="" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789366 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789375 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789380 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 23:26:32.794244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789385 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789390 2573 flags.go:64] FLAG: --port="10250" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789394 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789399 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0808c828090d9f2c1" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789404 2573 flags.go:64] FLAG: --qos-reserved="" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789409 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 16 
23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789414 2573 flags.go:64] FLAG: --register-node="true" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789419 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789424 2573 flags.go:64] FLAG: --register-with-taints="" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789429 2573 flags.go:64] FLAG: --registry-burst="10" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789434 2573 flags.go:64] FLAG: --registry-qps="5" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789438 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789443 2573 flags.go:64] FLAG: --reserved-memory="" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789449 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789454 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789459 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789464 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789469 2573 flags.go:64] FLAG: --runonce="false" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789473 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789478 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789483 2573 flags.go:64] FLAG: --seccomp-default="false" 
Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789487 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789492 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789497 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789502 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789507 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 23:26:32.794843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789512 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789522 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789526 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789548 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789554 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789559 2573 flags.go:64] FLAG: --system-cgroups="" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789563 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789572 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789577 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 16 23:26:32.795463 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:26:32.789582 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789588 2573 flags.go:64] FLAG: --tls-min-version="" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789593 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789597 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789602 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789607 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789612 2573 flags.go:64] FLAG: --v="2" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789618 2573 flags.go:64] FLAG: --version="false" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789625 2573 flags.go:64] FLAG: --vmodule="" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789631 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.789636 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789782 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789788 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789794 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: 
W0416 23:26:32.789799 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 23:26:32.795463 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789803 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789808 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789812 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789816 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789820 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789825 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789829 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789834 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789838 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789845 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789849 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789855 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 23:26:32.796079 ip-10-0-136-153 
kubenswrapper[2573]: W0416 23:26:32.789859 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789864 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789868 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789873 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789877 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789881 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789885 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789890 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 23:26:32.796079 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789894 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789898 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789902 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789906 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789910 2573 feature_gate.go:328] unrecognized feature 
gate: AdminNetworkPolicy Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789914 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789918 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789922 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789926 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789931 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789934 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789938 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789942 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789946 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789951 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789955 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789959 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: 
W0416 23:26:32.789963 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789967 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789972 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 23:26:32.796615 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789976 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789981 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789987 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789993 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.789997 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790001 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790005 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790009 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790013 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790018 2573 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790022 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790026 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790030 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790036 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790042 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790048 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790052 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790056 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790061 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 23:26:32.797108 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790065 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790070 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790074 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 23:26:32.797586 
ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790078 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790082 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790086 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790090 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790094 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790099 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790103 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790109 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790114 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790119 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790123 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790129 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790134 2573 feature_gate.go:328] unrecognized feature 
gate: GatewayAPI
Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790141 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790145 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790152 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:26:32.797586 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790157 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790163 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790167 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.790172 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.791191 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.797514 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.797530 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797593 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797599 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797602 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797605 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797608 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797611 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797614 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797617 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797620 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:26:32.798049 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797623 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797626 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797629 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797631 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797634 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797636 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797639 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797642 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797644 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797646 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797650 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797653 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797655 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797658 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797661 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797663 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797666 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797668 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797671 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797674 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:26:32.798446 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797676 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797679 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797683 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797685 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797688 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797690 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797693 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797696 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797699 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797701 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797704 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797706 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797709 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797711 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797714 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797717 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797720 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797722 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797725 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797727 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:26:32.798948 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797730 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797734 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797739 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797742 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797745 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797748 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797750 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797753 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797756 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797758 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797761 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797763 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797766 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797768 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797771 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797774 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797777 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797780 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797782 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:26:32.799443 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797785 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797787 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797790 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797792 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797795 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797798 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797800 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797803 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797805 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797808 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797811 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797813 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797816 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797820 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797823 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797826 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797829 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:26:32.799985 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797831 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.797836 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797935 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797940 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797943 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797946 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797949 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797952 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797954 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797957 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797960 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797963 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797966 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797969 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797971 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 23:26:32.800404 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797974 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797976 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797979 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797982 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797984 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797987 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797990 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797993 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797995 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.797998 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798001 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798004 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798006 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798009 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798012 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798014 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798017 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798020 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798022 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798025 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 23:26:32.800797 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798027 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798030 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798034 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798037 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798040 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798043 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798046 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798048 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798051 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798054 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798057 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798060 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798062 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798065 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798067 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798070 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798072 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798075 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798077 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 23:26:32.801288 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798080 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798083 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798085 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798087 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798090 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798092 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798095 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798097 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798100 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798102 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798105 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798107 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798110 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798112 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798114 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798117 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798119 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798122 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798124 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798127 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 23:26:32.801769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798129 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798133 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798137 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798140 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798143 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798146 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798148 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798151 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798153 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798155 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798158 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798160 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798163 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:32.798167 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.798173 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 23:26:32.802246 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.798970 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 23:26:32.802844 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.802830 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 23:26:32.803889 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.803878 2573 server.go:1019] "Starting client certificate rotation"
Apr 16 23:26:32.803992 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.803976 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:26:32.804022 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.804014 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 23:26:32.834072 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.834053 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:26:32.841956 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.841925 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 23:26:32.855449 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.855433 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 16 23:26:32.861573 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.861558 2573 log.go:25] "Validated CRI v1 image API"
Apr 16 23:26:32.863588 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.863573 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 23:26:32.866196 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.866176 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 23:26:32.866850 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.866831 2573 fs.go:135] Filesystem UUIDs: map[6df2bbd3-90d7-44c6-b2b5-e6ea653ede28:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8dd3cb4b-9992-463d-9061-0755bb0175d0:/dev/nvme0n1p4]
Apr 16 23:26:32.866898 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.866850 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 23:26:32.871910 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.871803 2573 manager.go:217] Machine: {Timestamp:2026-04-16 23:26:32.870522768 +0000 UTC m=+0.472441861 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096779 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a2a4aba1c99c603d3f794d835c910 SystemUUID:ec2a2a4a-ba1c-99c6-03d3-f794d835c910 BootID:74df5ca0-7881-4dd2-9161-e45c3b70e6e8 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:47:38:e9:53:1d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:47:38:e9:53:1d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:45:a6:32:a5:ce Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 23:26:32.871910 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.871906 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 23:26:32.872096 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.871999 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 23:26:32.874660 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.874633 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 23:26:32.874795 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.874662 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-153.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessTh
an","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 23:26:32.874836 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.874805 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 23:26:32.874836 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.874814 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 23:26:32.874836 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.874826 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:26:32.875723 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.875713 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 23:26:32.877979 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.877969 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:26:32.878090 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.878081 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 23:26:32.880594 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.880583 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 16 23:26:32.880626 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.880603 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 23:26:32.880626 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.880616 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 23:26:32.880626 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:26:32.880625 2573 kubelet.go:397] "Adding apiserver pod source" Apr 16 23:26:32.880737 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.880636 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 23:26:32.882001 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.881989 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 23:26:32.882043 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.882007 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 23:26:32.885857 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.885809 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8w9x9" Apr 16 23:26:32.886902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.886883 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 23:26:32.888709 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.888694 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 23:26:32.890679 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890668 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 23:26:32.890716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890685 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 23:26:32.890716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890691 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 23:26:32.890716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890696 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 23:26:32.890716 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:26:32.890701 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 23:26:32.890716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890706 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 23:26:32.890716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890713 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 23:26:32.890716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890719 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 23:26:32.890900 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890727 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 23:26:32.890900 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890733 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 23:26:32.890900 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890741 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 23:26:32.890900 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.890750 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 23:26:32.891799 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.891791 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 23:26:32.891799 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.891800 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 23:26:32.892779 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.892761 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8w9x9" Apr 16 23:26:32.894420 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:32.894398 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-153.ec2.internal\" is forbidden: User 
\"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 23:26:32.894504 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.894454 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-153.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 23:26:32.894504 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:32.894466 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 23:26:32.895087 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.895076 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 23:26:32.895123 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.895110 2573 server.go:1295] "Started kubelet" Apr 16 23:26:32.895235 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.895194 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 23:26:32.895297 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.895246 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 23:26:32.895775 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.895742 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 23:26:32.895824 ip-10-0-136-153 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 23:26:32.896888 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.896872 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 23:26:32.898459 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.898444 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 16 23:26:32.902213 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.902195 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 23:26:32.902514 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.902493 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 23:26:32.903329 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:32.903297 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-153.ec2.internal\" not found" Apr 16 23:26:32.903420 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.903330 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 23:26:32.903420 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.903372 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 23:26:32.903420 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.903389 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 23:26:32.903873 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.903855 2573 factory.go:55] Registering systemd factory Apr 16 23:26:32.903873 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.903871 2573 factory.go:223] Registration of the systemd container factory successfully Apr 16 23:26:32.904053 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.904041 2573 factory.go:153] Registering CRI-O factory Apr 16 23:26:32.904104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.904056 2573 factory.go:223] Registration of the crio container factory successfully Apr 16 23:26:32.904155 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:26:32.904108 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 23:26:32.904155 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.904128 2573 factory.go:103] Registering Raw factory Apr 16 23:26:32.904155 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.904144 2573 manager.go:1196] Started watching for new ooms in manager Apr 16 23:26:32.904411 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.904397 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 16 23:26:32.904518 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.904504 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 16 23:26:32.905247 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.905086 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:32.905313 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.905275 2573 manager.go:319] Starting recovery of all containers Apr 16 23:26:32.905973 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:32.905946 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 23:26:32.907547 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:32.907513 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-136-153.ec2.internal\" not found" node="ip-10-0-136-153.ec2.internal" Apr 16 23:26:32.917158 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.917140 2573 manager.go:324] Recovery completed Apr 16 23:26:32.921436 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.921424 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:32.923394 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.923382 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:32.923452 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.923405 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:32.923452 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.923414 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:32.923891 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.923879 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 23:26:32.923932 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.923891 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 23:26:32.923932 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.923910 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 16 23:26:32.926470 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.926459 2573 policy_none.go:49] "None policy: Start" Apr 16 23:26:32.926508 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.926475 2573 
memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 23:26:32.926508 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.926486 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 16 23:26:32.968096 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.968080 2573 manager.go:341] "Starting Device Plugin manager" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:32.968136 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.968149 2573 server.go:85] "Starting device plugin registration server" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.968360 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.968372 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.968469 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.968566 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:32.968573 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:32.969137 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 23:26:32.971067 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:32.969163 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-153.ec2.internal\" not found" Apr 16 23:26:33.052280 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.052231 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 23:26:33.053387 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.053370 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 23:26:33.053445 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.053401 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 23:26:33.053445 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.053420 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 23:26:33.053445 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.053428 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 23:26:33.053581 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.053463 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 23:26:33.055670 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.055651 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:33.069360 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.069342 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:33.070176 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.070154 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:33.070176 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.070179 2573 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:33.070302 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.070193 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:33.070302 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.070216 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.079013 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.078993 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.079101 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.079016 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-153.ec2.internal\": node \"ip-10-0-136-153.ec2.internal\" not found" Apr 16 23:26:33.097804 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.097783 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-153.ec2.internal\" not found" Apr 16 23:26:33.153667 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.153636 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal"] Apr 16 23:26:33.153736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.153700 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:33.154421 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.154406 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:33.154498 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.154429 2573 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:33.154498 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.154448 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:33.155523 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.155510 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:33.155659 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.155644 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.155735 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.155678 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:33.156178 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.156158 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:33.156249 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.156185 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:33.156249 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.156194 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:33.156249 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.156160 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:33.156342 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.156256 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 
23:26:33.156342 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.156265 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:33.159708 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.159690 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.159770 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.159727 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 23:26:33.160363 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.160342 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientMemory" Apr 16 23:26:33.160448 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.160374 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 23:26:33.160448 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.160389 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeHasSufficientPID" Apr 16 23:26:33.171038 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.171022 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-153.ec2.internal\" not found" node="ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.176201 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.176186 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-153.ec2.internal\" not found" node="ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.198797 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.198781 2573 kubelet_node_status.go:515] "Error getting the current node from lister" 
err="node \"ip-10-0-136-153.ec2.internal\" not found" Apr 16 23:26:33.207236 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.207220 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be4647434e0fd79acaa8b46b7c163d65-config\") pod \"kube-apiserver-proxy-ip-10-0-136-153.ec2.internal\" (UID: \"be4647434e0fd79acaa8b46b7c163d65\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.207314 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.207249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0eb09bb6acbd755bc18c097296960fb1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal\" (UID: \"0eb09bb6acbd755bc18c097296960fb1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.207314 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.207267 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eb09bb6acbd755bc18c097296960fb1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal\" (UID: \"0eb09bb6acbd755bc18c097296960fb1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.299761 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.299734 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-153.ec2.internal\" not found" Apr 16 23:26:33.308164 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.308119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be4647434e0fd79acaa8b46b7c163d65-config\") pod \"kube-apiserver-proxy-ip-10-0-136-153.ec2.internal\" (UID: 
\"be4647434e0fd79acaa8b46b7c163d65\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.308164 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.308150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0eb09bb6acbd755bc18c097296960fb1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal\" (UID: \"0eb09bb6acbd755bc18c097296960fb1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.308288 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.308176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eb09bb6acbd755bc18c097296960fb1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal\" (UID: \"0eb09bb6acbd755bc18c097296960fb1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.308288 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.308210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0eb09bb6acbd755bc18c097296960fb1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal\" (UID: \"0eb09bb6acbd755bc18c097296960fb1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.308288 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.308210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be4647434e0fd79acaa8b46b7c163d65-config\") pod \"kube-apiserver-proxy-ip-10-0-136-153.ec2.internal\" (UID: \"be4647434e0fd79acaa8b46b7c163d65\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal" Apr 16 23:26:33.308288 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.308211 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0eb09bb6acbd755bc18c097296960fb1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal\" (UID: \"0eb09bb6acbd755bc18c097296960fb1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal"
Apr 16 23:26:33.400526 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.400501 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-153.ec2.internal\" not found"
Apr 16 23:26:33.472982 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.472952 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal"
Apr 16 23:26:33.478594 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.478577 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal"
Apr 16 23:26:33.500986 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.500961 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-153.ec2.internal\" not found"
Apr 16 23:26:33.601570 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.601518 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-153.ec2.internal\" not found"
Apr 16 23:26:33.702019 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.701999 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-153.ec2.internal\" not found"
Apr 16 23:26:33.787808 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.787783 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 23:26:33.802642 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.802618 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal"
Apr 16 23:26:33.803151 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.803133 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 23:26:33.803279 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.803262 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 23:26:33.803334 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.803262 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 23:26:33.803334 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.803295 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 23:26:33.818148 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.818126 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 23:26:33.819868 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.819856 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal"
Apr 16 23:26:33.829575 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.829559 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 23:26:33.880821 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.880778 2573 apiserver.go:52] "Watching apiserver"
Apr 16 23:26:33.890805 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.890788 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 23:26:33.891145 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.891124 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl","openshift-cluster-node-tuning-operator/tuned-ppndn","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal","openshift-multus/multus-additional-cni-plugins-wz8rl","openshift-multus/multus-vr4xk","openshift-ovn-kubernetes/ovnkube-node-5pn57","kube-system/konnectivity-agent-4np6h","openshift-dns/node-resolver-m8gnt","openshift-image-registry/node-ca-kt5t6","openshift-multus/network-metrics-daemon-qhz5v","openshift-network-diagnostics/network-check-target-svp68","openshift-network-operator/iptables-alerter-869hf"]
Apr 16 23:26:33.892843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.892826 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:33.894326 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.894302 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 23:21:32 +0000 UTC" deadline="2027-11-19 01:04:12.821580629 +0000 UTC"
Apr 16 23:26:33.894326 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.894324 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13945h37m38.927259242s"
Apr 16 23:26:33.894660 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.894626 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.894836 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.894817 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 23:26:33.895069 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.895052 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4d7bt\""
Apr 16 23:26:33.895069 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.895066 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 23:26:33.895201 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.895067 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 23:26:33.896063 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.896046 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.896566 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.896531 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-q6zqf\""
Apr 16 23:26:33.896566 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.896563 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 23:26:33.896722 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.896548 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:26:33.897386 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.897370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.897866 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.897849 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 23:26:33.897962 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.897910 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 23:26:33.898244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.898228 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wwdv4\""
Apr 16 23:26:33.898358 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.898347 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 23:26:33.898510 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.898408 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 23:26:33.898510 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.898443 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 23:26:33.898841 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.898825 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.898995 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.898972 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 23:26:33.899418 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.899404 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rq8fl\""
Apr 16 23:26:33.900596 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.900580 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-4np6h"
Apr 16 23:26:33.900994 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.900976 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 23:26:33.901097 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.901005 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 23:26:33.901097 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.901013 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mvt7v\""
Apr 16 23:26:33.901097 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.901039 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 23:26:33.901097 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.901042 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 23:26:33.901097 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.901070 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 23:26:33.901271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.901013 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 23:26:33.901894 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.901879 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:33.902469 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.902454 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rbrlz\""
Apr 16 23:26:33.902598 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.902586 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 23:26:33.902785 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.902772 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 23:26:33.902955 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.902938 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 23:26:33.903152 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.903132 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:33.903675 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.903660 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-q2d9s\""
Apr 16 23:26:33.903890 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.903866 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 23:26:33.904327 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.904311 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 23:26:33.904637 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.904617 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v"
Apr 16 23:26:33.905494 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.905056 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c"
Apr 16 23:26:33.905494 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.905296 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-pkmwl\""
Apr 16 23:26:33.905494 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.905321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 23:26:33.905494 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.905349 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 23:26:33.905494 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.905357 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 23:26:33.907431 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.907409 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68"
Apr 16 23:26:33.907819 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:33.907794 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760"
Apr 16 23:26:33.909131 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.909117 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-869hf"
Apr 16 23:26:33.911197 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911178 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:26:33.911197 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911188 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 23:26:33.911338 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911201 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 23:26:33.911338 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911290 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7lzd4\""
Apr 16 23:26:33.911338 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c864774-d6f1-4d07-8798-126023861e55-ovn-node-metrics-cert\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.911489 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911344 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e54684e-7b38-446d-a750-4bb17e3d69b0-cni-binary-copy\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.911489 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911386 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysconfig\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.911489 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911419 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-systemd\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.911489 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911444 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-node-log\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.911489 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911468 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-etc-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.911736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911492 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-ovn\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.911736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911515 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a40691e1-4691-4b8d-b935-ff781629806d-tmp-dir\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:33.911736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911558 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9lg9\" (UniqueName: \"kubernetes.io/projected/a40691e1-4691-4b8d-b935-ff781629806d-kube-api-access-h9lg9\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:33.911736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911582 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bhkr\" (UniqueName: \"kubernetes.io/projected/87459b43-0232-4d86-97d1-bb0d563c1ecb-kube-api-access-6bhkr\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:33.911736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911625 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-cni-multus\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.911736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk2pm\" (UniqueName: \"kubernetes.io/projected/e24fef17-a7c3-497e-a65f-9458686c8ea2-kube-api-access-hk2pm\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:33.911736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-systemd\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.911736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911731 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-var-lib-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-multus-certs\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysctl-conf\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911842 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911909 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqjc\" (UniqueName: \"kubernetes.io/projected/9c864774-d6f1-4d07-8798-126023861e55-kube-api-access-4jqjc\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.911977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e24fef17-a7c3-497e-a65f-9458686c8ea2-host\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912001 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e24fef17-a7c3-497e-a65f-9458686c8ea2-serviceca\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912019 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-modprobe-d\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-host\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912074 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-system-cni-dir\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912096 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-os-release\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912118 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.912143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-system-cni-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912207 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-sys-fs\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912245 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-sys\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912273 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07480989-b282-4bef-8d30-88f1027e6fb5-tmp\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmmws\" (UniqueName: \"kubernetes.io/projected/07480989-b282-4bef-8d30-88f1027e6fb5-kube-api-access-zmmws\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912322 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cnibin\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912346 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912372 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-kubernetes\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912393 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912415 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-cni-netd\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-kubelet\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912458 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9cr\" (UniqueName: \"kubernetes.io/projected/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-kube-api-access-2w9cr\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912496 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-kubelet\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912520 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-etc-selinux\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912552 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-os-release\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912583 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-conf-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.912833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912614 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-daemon-config\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912637 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-log-socket\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912660 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-ovnkube-config\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a40691e1-4691-4b8d-b935-ff781629806d-hosts-file\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912717 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-device-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-cni-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912786 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-netns\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912808 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-hostroot\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912830 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a8419133-562e-45db-abac-da560b01e6d9-konnectivity-ca\") pod \"konnectivity-agent-4np6h\" (UID: \"a8419133-562e-45db-abac-da560b01e6d9\") " pod="kube-system/konnectivity-agent-4np6h"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912863 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-registration-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-socket-dir-parent\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xp5b\" (UniqueName: \"kubernetes.io/projected/7e54684e-7b38-446d-a750-4bb17e3d69b0-kube-api-access-5xp5b\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.912976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-env-overrides\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:33.913662
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913006 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-cni-bin\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-systemd-units\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-etc-kubernetes\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913083 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-ovnkube-script-lib\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:33.913662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913112 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-slash\") pod \"ovnkube-node-5pn57\" (UID: 
\"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-run-netns\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913204 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-socket-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913228 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-run\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07480989-b282-4bef-8d30-88f1027e6fb5-etc-tuned\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-cnibin\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh869\" (UniqueName: \"kubernetes.io/projected/3ed66159-86cf-4f43-824b-3905a5019c1c-kube-api-access-mh869\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913333 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysctl-d\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-lib-modules\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913383 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-cni-bin\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913409 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913435 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-k8s-cni-cncf-io\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a8419133-562e-45db-abac-da560b01e6d9-agent-certs\") pod \"konnectivity-agent-4np6h\" (UID: \"a8419133-562e-45db-abac-da560b01e6d9\") " pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913497 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:33.914214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.913550 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-var-lib-kubelet\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:33.924723 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.924704 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:33.933622 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.933604 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bc6f7" Apr 16 23:26:33.938708 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:33.938689 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bc6f7" Apr 16 23:26:34.005081 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.005059 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 23:26:34.005617 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.005596 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eb09bb6acbd755bc18c097296960fb1.slice/crio-5ce1b0b5c3e0fd97d24c830fa9128e281dfb6965d29785d07548d9d0c83a8b06 WatchSource:0}: Error finding container 5ce1b0b5c3e0fd97d24c830fa9128e281dfb6965d29785d07548d9d0c83a8b06: Status 404 returned error can't find the container with id 5ce1b0b5c3e0fd97d24c830fa9128e281dfb6965d29785d07548d9d0c83a8b06 Apr 16 23:26:34.005853 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.005831 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe4647434e0fd79acaa8b46b7c163d65.slice/crio-e86266ed579c6d0c55758ccff255b97b82a475810d26b0b9945c88eafbe53b03 WatchSource:0}: 
Error finding container e86266ed579c6d0c55758ccff255b97b82a475810d26b0b9945c88eafbe53b03: Status 404 returned error can't find the container with id e86266ed579c6d0c55758ccff255b97b82a475810d26b0b9945c88eafbe53b03 Apr 16 23:26:34.011766 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.011750 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:26:34.013672 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-systemd-units\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.013747 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-etc-kubernetes\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.013747 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-ovnkube-script-lib\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.013843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013747 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-systemd-units\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 
23:26:34.013843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013755 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-slash\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.013843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-run-netns\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.013843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013782 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-etc-kubernetes\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.013843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-run-netns\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.013843 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013836 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-socket-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" Apr 16 
23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013862 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-run\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07480989-b282-4bef-8d30-88f1027e6fb5-etc-tuned\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-cnibin\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-slash\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mh869\" (UniqueName: \"kubernetes.io/projected/3ed66159-86cf-4f43-824b-3905a5019c1c-kube-api-access-mh869\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:34.014111 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:26:34.013955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysctl-d\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-socket-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.013978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-lib-modules\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014003 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-cni-bin\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014012 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-run\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014111 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:26:34.014028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014051 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-k8s-cni-cncf-io\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014072 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a8419133-562e-45db-abac-da560b01e6d9-agent-certs\") pod \"konnectivity-agent-4np6h\" (UID: \"a8419133-562e-45db-abac-da560b01e6d9\") " pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014106 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysctl-d\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014108 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " 
pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014175 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-cni-bin\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014212 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-k8s-cni-cncf-io\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-cnibin\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014456 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-ovnkube-script-lib\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014493 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-var-lib-kubelet\") pod \"tuned-ppndn\" (UID: 
\"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014509 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-lib-modules\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c864774-d6f1-4d07-8798-126023861e55-ovn-node-metrics-cert\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014592 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-var-lib-kubelet\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.014624 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014626 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e54684e-7b38-446d-a750-4bb17e3d69b0-cni-binary-copy\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.014709 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs podName:3ed66159-86cf-4f43-824b-3905a5019c1c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:34.514664087 +0000 UTC m=+2.116583186 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs") pod "network-metrics-daemon-qhz5v" (UID: "3ed66159-86cf-4f43-824b-3905a5019c1c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014742 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysconfig\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014769 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-systemd\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.014810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-node-log\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-systemd\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014896 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-etc-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014923 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysconfig\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-ovn\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a40691e1-4691-4b8d-b935-ff781629806d-tmp-dir\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9lg9\" (UniqueName: \"kubernetes.io/projected/a40691e1-4691-4b8d-b935-ff781629806d-kube-api-access-h9lg9\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015002 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-ovn\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bhkr\" (UniqueName: \"kubernetes.io/projected/87459b43-0232-4d86-97d1-bb0d563c1ecb-kube-api-access-6bhkr\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-node-log\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-cni-multus\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015074 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk2pm\" (UniqueName: \"kubernetes.io/projected/e24fef17-a7c3-497e-a65f-9458686c8ea2-kube-api-access-hk2pm\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015078 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-cni-multus\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015107 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b39de8b0-dcb5-4dc5-81e3-2a24820be395-host-slash\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " pod="openshift-network-operator/iptables-alerter-869hf"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015119 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7e54684e-7b38-446d-a750-4bb17e3d69b0-cni-binary-copy\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-systemd\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-var-lib-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.015810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-multus-certs\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs6gq\" (UniqueName: \"kubernetes.io/projected/b39de8b0-dcb5-4dc5-81e3-2a24820be395-kube-api-access-qs6gq\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " pod="openshift-network-operator/iptables-alerter-869hf"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysctl-conf\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015295 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015348 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015359 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a40691e1-4691-4b8d-b935-ff781629806d-tmp-dir\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jqjc\" (UniqueName: \"kubernetes.io/projected/9c864774-d6f1-4d07-8798-126023861e55-kube-api-access-4jqjc\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.014971 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-etc-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e24fef17-a7c3-497e-a65f-9458686c8ea2-host\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015427 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-multus-certs\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e24fef17-a7c3-497e-a65f-9458686c8ea2-serviceca\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015500 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-modprobe-d\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015527 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-host\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-system-cni-dir\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-os-release\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.016577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-systemd\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015667 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-system-cni-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015697 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-sys-fs\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-sys\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07480989-b282-4bef-8d30-88f1027e6fb5-tmp\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zmmws\" (UniqueName: \"kubernetes.io/projected/07480989-b282-4bef-8d30-88f1027e6fb5-kube-api-access-zmmws\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015795 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cnibin\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-kubernetes\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-system-cni-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015916 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-cni-netd\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e24fef17-a7c3-497e-a65f-9458686c8ea2-serviceca\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015952 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-var-lib-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.015967 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-kubelet\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-sys-fs\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.017332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016004 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b39de8b0-dcb5-4dc5-81e3-2a24820be395-iptables-alerter-script\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " pod="openshift-network-operator/iptables-alerter-869hf"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016029 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-modprobe-d\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016053 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-sys\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9cr\" (UniqueName: \"kubernetes.io/projected/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-kube-api-access-2w9cr\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-kubelet\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016117 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-system-cni-dir\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-etc-selinux\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-os-release\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016204 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-conf-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016230 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-daemon-config\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-log-socket\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016289 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-ovnkube-config\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016318 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a40691e1-4691-4b8d-b935-ff781629806d-hosts-file\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016343 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-device-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016370 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-cni-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016396 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-netns\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016402 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-kubernetes\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-hostroot\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016447 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a8419133-562e-45db-abac-da560b01e6d9-konnectivity-ca\") pod \"konnectivity-agent-4np6h\" (UID: \"a8419133-562e-45db-abac-da560b01e6d9\") " pod="kube-system/konnectivity-agent-4np6h"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-registration-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016501 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-socket-dir-parent\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xp5b\" (UniqueName: \"kubernetes.io/projected/7e54684e-7b38-446d-a750-4bb17e3d69b0-kube-api-access-5xp5b\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-env-overrides\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-cni-bin\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016642 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-os-release\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cnibin\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016694 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-cni-bin\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016706 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-kubelet\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016740 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a40691e1-4691-4b8d-b935-ff781629806d-hosts-file\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-etc-selinux\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-device-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-os-release\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016849 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-cni-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016855 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-conf-dir\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016886 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-hostroot\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.018781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-run-netns\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/07480989-b282-4bef-8d30-88f1027e6fb5-etc-tuned\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017006 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-ovnkube-config\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-cni-netd\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017046 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-cni-binary-copy\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017120 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c864774-d6f1-4d07-8798-126023861e55-ovn-node-metrics-cert\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017170 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/87459b43-0232-4d86-97d1-bb0d563c1ecb-registration-dir\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-log-socket\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.016075 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-host\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017286 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-socket-dir-parent\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017392 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e24fef17-a7c3-497e-a65f-9458686c8ea2-host\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6"
Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-run-openvswitch\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017450 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c864774-d6f1-4d07-8798-126023861e55-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl" Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e54684e-7b38-446d-a750-4bb17e3d69b0-host-var-lib-kubelet\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.019351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.017677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/07480989-b282-4bef-8d30-88f1027e6fb5-etc-sysctl-conf\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.019985 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:26:34.018011 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7e54684e-7b38-446d-a750-4bb17e3d69b0-multus-daemon-config\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.019985 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.018094 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c864774-d6f1-4d07-8798-126023861e55-env-overrides\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.019985 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.018417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a8419133-562e-45db-abac-da560b01e6d9-konnectivity-ca\") pod \"konnectivity-agent-4np6h\" (UID: \"a8419133-562e-45db-abac-da560b01e6d9\") " pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:26:34.019985 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.019329 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/07480989-b282-4bef-8d30-88f1027e6fb5-tmp\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.020124 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.020009 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a8419133-562e-45db-abac-da560b01e6d9-agent-certs\") pod \"konnectivity-agent-4np6h\" (UID: \"a8419133-562e-45db-abac-da560b01e6d9\") " pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:26:34.024076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.023912 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-h9lg9\" (UniqueName: \"kubernetes.io/projected/a40691e1-4691-4b8d-b935-ff781629806d-kube-api-access-h9lg9\") pod \"node-resolver-m8gnt\" (UID: \"a40691e1-4691-4b8d-b935-ff781629806d\") " pod="openshift-dns/node-resolver-m8gnt" Apr 16 23:26:34.024345 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.024319 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk2pm\" (UniqueName: \"kubernetes.io/projected/e24fef17-a7c3-497e-a65f-9458686c8ea2-kube-api-access-hk2pm\") pod \"node-ca-kt5t6\" (UID: \"e24fef17-a7c3-497e-a65f-9458686c8ea2\") " pod="openshift-image-registry/node-ca-kt5t6" Apr 16 23:26:34.024580 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.024500 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bhkr\" (UniqueName: \"kubernetes.io/projected/87459b43-0232-4d86-97d1-bb0d563c1ecb-kube-api-access-6bhkr\") pod \"aws-ebs-csi-driver-node-wxfnl\" (UID: \"87459b43-0232-4d86-97d1-bb0d563c1ecb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" Apr 16 23:26:34.024580 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.024519 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmmws\" (UniqueName: \"kubernetes.io/projected/07480989-b282-4bef-8d30-88f1027e6fb5-kube-api-access-zmmws\") pod \"tuned-ppndn\" (UID: \"07480989-b282-4bef-8d30-88f1027e6fb5\") " pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.024804 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.024769 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jqjc\" (UniqueName: \"kubernetes.io/projected/9c864774-d6f1-4d07-8798-126023861e55-kube-api-access-4jqjc\") pod \"ovnkube-node-5pn57\" (UID: \"9c864774-d6f1-4d07-8798-126023861e55\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.025137 ip-10-0-136-153 kubenswrapper[2573]: 
I0416 23:26:34.025121 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh869\" (UniqueName: \"kubernetes.io/projected/3ed66159-86cf-4f43-824b-3905a5019c1c-kube-api-access-mh869\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:34.025894 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.025877 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9cr\" (UniqueName: \"kubernetes.io/projected/5cb2ede9-d7bb-4d1f-9aca-83f7715b5495-kube-api-access-2w9cr\") pod \"multus-additional-cni-plugins-wz8rl\" (UID: \"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495\") " pod="openshift-multus/multus-additional-cni-plugins-wz8rl" Apr 16 23:26:34.026427 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.026412 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xp5b\" (UniqueName: \"kubernetes.io/projected/7e54684e-7b38-446d-a750-4bb17e3d69b0-kube-api-access-5xp5b\") pod \"multus-vr4xk\" (UID: \"7e54684e-7b38-446d-a750-4bb17e3d69b0\") " pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.055808 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.055775 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" event={"ID":"0eb09bb6acbd755bc18c097296960fb1","Type":"ContainerStarted","Data":"5ce1b0b5c3e0fd97d24c830fa9128e281dfb6965d29785d07548d9d0c83a8b06"} Apr 16 23:26:34.056600 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.056582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal" event={"ID":"be4647434e0fd79acaa8b46b7c163d65","Type":"ContainerStarted","Data":"e86266ed579c6d0c55758ccff255b97b82a475810d26b0b9945c88eafbe53b03"} Apr 16 23:26:34.117826 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.117809 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b39de8b0-dcb5-4dc5-81e3-2a24820be395-iptables-alerter-script\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " pod="openshift-network-operator/iptables-alerter-869hf" Apr 16 23:26:34.117921 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.117842 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:34.117921 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.117894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b39de8b0-dcb5-4dc5-81e3-2a24820be395-host-slash\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " pod="openshift-network-operator/iptables-alerter-869hf" Apr 16 23:26:34.118032 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.118004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs6gq\" (UniqueName: \"kubernetes.io/projected/b39de8b0-dcb5-4dc5-81e3-2a24820be395-kube-api-access-qs6gq\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " pod="openshift-network-operator/iptables-alerter-869hf" Apr 16 23:26:34.118089 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.118039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b39de8b0-dcb5-4dc5-81e3-2a24820be395-host-slash\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " 
pod="openshift-network-operator/iptables-alerter-869hf" Apr 16 23:26:34.118797 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.118780 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b39de8b0-dcb5-4dc5-81e3-2a24820be395-iptables-alerter-script\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " pod="openshift-network-operator/iptables-alerter-869hf" Apr 16 23:26:34.124018 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.124002 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:34.124087 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.124024 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:34.124087 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.124035 2573 projected.go:194] Error preparing data for projected volume kube-api-access-57mph for pod openshift-network-diagnostics/network-check-target-svp68: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:34.124182 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.124091 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph podName:df573a86-aad9-4aaa-9c40-5e9073ed8760 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:34.624073772 +0000 UTC m=+2.225992869 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-57mph" (UniqueName: "kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph") pod "network-check-target-svp68" (UID: "df573a86-aad9-4aaa-9c40-5e9073ed8760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:34.125615 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.125600 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs6gq\" (UniqueName: \"kubernetes.io/projected/b39de8b0-dcb5-4dc5-81e3-2a24820be395-kube-api-access-qs6gq\") pod \"iptables-alerter-869hf\" (UID: \"b39de8b0-dcb5-4dc5-81e3-2a24820be395\") " pod="openshift-network-operator/iptables-alerter-869hf" Apr 16 23:26:34.227979 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.227927 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" Apr 16 23:26:34.233660 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.233640 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87459b43_0232_4d86_97d1_bb0d563c1ecb.slice/crio-00860290b0dbf3142eb8b68548dfaea0d896e4e2d1263ea700efd0601af5d630 WatchSource:0}: Error finding container 00860290b0dbf3142eb8b68548dfaea0d896e4e2d1263ea700efd0601af5d630: Status 404 returned error can't find the container with id 00860290b0dbf3142eb8b68548dfaea0d896e4e2d1263ea700efd0601af5d630 Apr 16 23:26:34.247732 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.247717 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ppndn" Apr 16 23:26:34.251222 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.251206 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" Apr 16 23:26:34.252664 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.252644 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07480989_b282_4bef_8d30_88f1027e6fb5.slice/crio-6fcc83f83ca65e66c03712ae3a9a08b2df75ffe66d4d3f15bdffd5caff89f331 WatchSource:0}: Error finding container 6fcc83f83ca65e66c03712ae3a9a08b2df75ffe66d4d3f15bdffd5caff89f331: Status 404 returned error can't find the container with id 6fcc83f83ca65e66c03712ae3a9a08b2df75ffe66d4d3f15bdffd5caff89f331 Apr 16 23:26:34.258366 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.258345 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb2ede9_d7bb_4d1f_9aca_83f7715b5495.slice/crio-b99fadc3a85ea577950883ea4d57655f92f69e62f97646596d18fe4594b85c00 WatchSource:0}: Error finding container b99fadc3a85ea577950883ea4d57655f92f69e62f97646596d18fe4594b85c00: Status 404 returned error can't find the container with id b99fadc3a85ea577950883ea4d57655f92f69e62f97646596d18fe4594b85c00 Apr 16 23:26:34.265209 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.265193 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-vr4xk" Apr 16 23:26:34.270655 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.270633 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e54684e_7b38_446d_a750_4bb17e3d69b0.slice/crio-557896544bf8279e6963f475cde30c30e100c0624dc19d216c1018831eb56d79 WatchSource:0}: Error finding container 557896544bf8279e6963f475cde30c30e100c0624dc19d216c1018831eb56d79: Status 404 returned error can't find the container with id 557896544bf8279e6963f475cde30c30e100c0624dc19d216c1018831eb56d79 Apr 16 23:26:34.278558 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.278528 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:34.283571 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.283363 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c864774_d6f1_4d07_8798_126023861e55.slice/crio-85034043088cb2f7ca7a9b7ec53dac356e9f0fde0d7a7daaead1f5674034ff15 WatchSource:0}: Error finding container 85034043088cb2f7ca7a9b7ec53dac356e9f0fde0d7a7daaead1f5674034ff15: Status 404 returned error can't find the container with id 85034043088cb2f7ca7a9b7ec53dac356e9f0fde0d7a7daaead1f5674034ff15 Apr 16 23:26:34.301674 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.301657 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:26:34.307094 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.307074 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8419133_562e_45db_abac_da560b01e6d9.slice/crio-99cf42022fce093d50ee1ce2b3ae512ac0d4ab1972dbbce8d8d2fa8b46f53c9c WatchSource:0}: Error finding container 99cf42022fce093d50ee1ce2b3ae512ac0d4ab1972dbbce8d8d2fa8b46f53c9c: Status 404 returned error can't find the container with id 99cf42022fce093d50ee1ce2b3ae512ac0d4ab1972dbbce8d8d2fa8b46f53c9c Apr 16 23:26:34.311151 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.311137 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m8gnt" Apr 16 23:26:34.316338 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.316319 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda40691e1_4691_4b8d_b935_ff781629806d.slice/crio-fe5b7ae5c527d79e6ab48dff9ffca5942d077fc4b09db71b3e351199d1a265d8 WatchSource:0}: Error finding container fe5b7ae5c527d79e6ab48dff9ffca5942d077fc4b09db71b3e351199d1a265d8: Status 404 returned error can't find the container with id fe5b7ae5c527d79e6ab48dff9ffca5942d077fc4b09db71b3e351199d1a265d8 Apr 16 23:26:34.317906 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.317878 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kt5t6" Apr 16 23:26:34.323291 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.323272 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode24fef17_a7c3_497e_a65f_9458686c8ea2.slice/crio-ec0c8747f08c609a0077b518683e829ce854775d6e43aa5d1ba5927d8163a0b5 WatchSource:0}: Error finding container ec0c8747f08c609a0077b518683e829ce854775d6e43aa5d1ba5927d8163a0b5: Status 404 returned error can't find the container with id ec0c8747f08c609a0077b518683e829ce854775d6e43aa5d1ba5927d8163a0b5 Apr 16 23:26:34.324636 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.324623 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-869hf" Apr 16 23:26:34.331333 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:26:34.331312 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39de8b0_dcb5_4dc5_81e3_2a24820be395.slice/crio-606fdb28892df0f573fac0480701f458e905ce71bf2f1d6511924745d1acff78 WatchSource:0}: Error finding container 606fdb28892df0f573fac0480701f458e905ce71bf2f1d6511924745d1acff78: Status 404 returned error can't find the container with id 606fdb28892df0f573fac0480701f458e905ce71bf2f1d6511924745d1acff78 Apr 16 23:26:34.521759 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.521686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:34.521893 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.521803 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:34.521893 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.521882 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs podName:3ed66159-86cf-4f43-824b-3905a5019c1c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:35.521861522 +0000 UTC m=+3.123780615 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs") pod "network-metrics-daemon-qhz5v" (UID: "3ed66159-86cf-4f43-824b-3905a5019c1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:34.702959 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.702778 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:34.724754 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.724141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:34.724754 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.724303 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:34.724754 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.724330 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:34.724754 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.724345 2573 
projected.go:194] Error preparing data for projected volume kube-api-access-57mph for pod openshift-network-diagnostics/network-check-target-svp68: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:34.724754 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:34.724396 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph podName:df573a86-aad9-4aaa-9c40-5e9073ed8760 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:35.724379269 +0000 UTC m=+3.326298363 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-57mph" (UniqueName: "kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph") pod "network-check-target-svp68" (UID: "df573a86-aad9-4aaa-9c40-5e9073ed8760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:34.939893 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.939814 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:21:33 +0000 UTC" deadline="2027-10-18 19:10:58.49222127 +0000 UTC" Apr 16 23:26:34.939893 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:34.939851 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13195h44m23.552373722s" Apr 16 23:26:35.056436 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.056406 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:35.056623 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:35.056520 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:35.071407 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.070553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kt5t6" event={"ID":"e24fef17-a7c3-497e-a65f-9458686c8ea2","Type":"ContainerStarted","Data":"ec0c8747f08c609a0077b518683e829ce854775d6e43aa5d1ba5927d8163a0b5"} Apr 16 23:26:35.072828 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.072796 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m8gnt" event={"ID":"a40691e1-4691-4b8d-b935-ff781629806d","Type":"ContainerStarted","Data":"fe5b7ae5c527d79e6ab48dff9ffca5942d077fc4b09db71b3e351199d1a265d8"} Apr 16 23:26:35.074391 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.074368 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerStarted","Data":"b99fadc3a85ea577950883ea4d57655f92f69e62f97646596d18fe4594b85c00"} Apr 16 23:26:35.078558 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.078519 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ppndn" event={"ID":"07480989-b282-4bef-8d30-88f1027e6fb5","Type":"ContainerStarted","Data":"6fcc83f83ca65e66c03712ae3a9a08b2df75ffe66d4d3f15bdffd5caff89f331"} Apr 16 23:26:35.088741 ip-10-0-136-153 kubenswrapper[2573]: 
I0416 23:26:35.088716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" event={"ID":"87459b43-0232-4d86-97d1-bb0d563c1ecb","Type":"ContainerStarted","Data":"00860290b0dbf3142eb8b68548dfaea0d896e4e2d1263ea700efd0601af5d630"} Apr 16 23:26:35.092549 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.092513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4np6h" event={"ID":"a8419133-562e-45db-abac-da560b01e6d9","Type":"ContainerStarted","Data":"99cf42022fce093d50ee1ce2b3ae512ac0d4ab1972dbbce8d8d2fa8b46f53c9c"} Apr 16 23:26:35.095258 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.095233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"85034043088cb2f7ca7a9b7ec53dac356e9f0fde0d7a7daaead1f5674034ff15"} Apr 16 23:26:35.105476 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.105451 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vr4xk" event={"ID":"7e54684e-7b38-446d-a750-4bb17e3d69b0","Type":"ContainerStarted","Data":"557896544bf8279e6963f475cde30c30e100c0624dc19d216c1018831eb56d79"} Apr 16 23:26:35.107765 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.107742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-869hf" event={"ID":"b39de8b0-dcb5-4dc5-81e3-2a24820be395","Type":"ContainerStarted","Data":"606fdb28892df0f573fac0480701f458e905ce71bf2f1d6511924745d1acff78"} Apr 16 23:26:35.335456 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.335132 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:35.530360 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.529838 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:35.530360 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:35.529970 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:35.530360 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:35.530038 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs podName:3ed66159-86cf-4f43-824b-3905a5019c1c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:37.53001947 +0000 UTC m=+5.131938551 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs") pod "network-metrics-daemon-qhz5v" (UID: "3ed66159-86cf-4f43-824b-3905a5019c1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:35.732856 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.732777 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:35.733015 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:35.732942 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:35.733015 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:35.732967 2573 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:35.733015 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:35.732980 2573 projected.go:194] Error preparing data for projected volume kube-api-access-57mph for pod openshift-network-diagnostics/network-check-target-svp68: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:35.733178 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:35.733035 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph podName:df573a86-aad9-4aaa-9c40-5e9073ed8760 nodeName:}" failed. No retries permitted until 2026-04-16 23:26:37.733016296 +0000 UTC m=+5.334935384 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-57mph" (UniqueName: "kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph") pod "network-check-target-svp68" (UID: "df573a86-aad9-4aaa-9c40-5e9073ed8760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:35.940904 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.940864 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 23:21:33 +0000 UTC" deadline="2028-01-16 11:42:13.550754504 +0000 UTC" Apr 16 23:26:35.940904 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:35.940901 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15348h15m37.609857392s" Apr 16 23:26:36.054026 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:36.053951 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:36.054172 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:36.054087 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:36.122704 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:36.122661 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 23:26:37.056549 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:37.056507 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:37.056973 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:37.056645 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:37.548728 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:37.548646 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:37.548870 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:37.548797 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:37.548870 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:37.548860 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs podName:3ed66159-86cf-4f43-824b-3905a5019c1c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:41.548841228 +0000 UTC m=+9.150760331 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs") pod "network-metrics-daemon-qhz5v" (UID: "3ed66159-86cf-4f43-824b-3905a5019c1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:37.750651 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:37.750619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:37.750828 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:37.750808 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:37.750886 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:37.750832 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:37.750886 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:37.750845 2573 projected.go:194] Error preparing data for projected volume kube-api-access-57mph for pod openshift-network-diagnostics/network-check-target-svp68: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:37.750991 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:37.750908 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph podName:df573a86-aad9-4aaa-9c40-5e9073ed8760 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:41.750889415 +0000 UTC m=+9.352808499 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-57mph" (UniqueName: "kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph") pod "network-check-target-svp68" (UID: "df573a86-aad9-4aaa-9c40-5e9073ed8760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:38.053631 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:38.053601 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:38.053809 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:38.053752 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:39.053695 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:39.053661 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:39.054153 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:39.053788 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:40.054366 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:40.054335 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:40.054838 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:40.054481 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:41.054372 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:41.054342 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:41.054827 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:41.054468 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:41.580977 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:41.580898 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:41.581145 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:41.581069 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:41.581200 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:41.581147 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs podName:3ed66159-86cf-4f43-824b-3905a5019c1c nodeName:}" failed. No retries permitted until 2026-04-16 23:26:49.58112663 +0000 UTC m=+17.183045722 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs") pod "network-metrics-daemon-qhz5v" (UID: "3ed66159-86cf-4f43-824b-3905a5019c1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:41.782546 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:41.782495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:41.782721 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:41.782652 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:41.782721 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:41.782703 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:41.782721 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:41.782717 2573 projected.go:194] Error preparing data for projected volume kube-api-access-57mph for pod openshift-network-diagnostics/network-check-target-svp68: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:41.782885 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:41.782777 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph podName:df573a86-aad9-4aaa-9c40-5e9073ed8760 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:26:49.782757685 +0000 UTC m=+17.384676786 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-57mph" (UniqueName: "kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph") pod "network-check-target-svp68" (UID: "df573a86-aad9-4aaa-9c40-5e9073ed8760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:42.054589 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:42.054035 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:42.054589 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:42.054181 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:43.055497 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:43.055400 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:43.055955 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:43.055509 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:44.054362 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:44.054276 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:44.054502 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:44.054405 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:45.053776 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:45.053746 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:45.054242 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:45.053844 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:46.054343 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:46.054307 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:46.054820 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:46.054442 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:47.054104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:47.054071 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:47.054296 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:47.054213 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:48.053694 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:48.053661 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:48.054131 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:48.053777 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:49.053930 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:49.053897 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:49.054390 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:49.054027 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:49.644416 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:49.644377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:49.644644 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:49.644526 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:49.644644 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:49.644611 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs podName:3ed66159-86cf-4f43-824b-3905a5019c1c nodeName:}" failed. No retries permitted until 2026-04-16 23:27:05.644589914 +0000 UTC m=+33.246508997 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs") pod "network-metrics-daemon-qhz5v" (UID: "3ed66159-86cf-4f43-824b-3905a5019c1c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 23:26:49.844897 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:49.844864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:49.845051 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:49.844990 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 23:26:49.845051 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:49.845013 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 23:26:49.845051 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:49.845023 2573 projected.go:194] Error preparing data for projected volume kube-api-access-57mph for pod openshift-network-diagnostics/network-check-target-svp68: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:49.845181 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:49.845071 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph podName:df573a86-aad9-4aaa-9c40-5e9073ed8760 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:27:05.845057436 +0000 UTC m=+33.446976516 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-57mph" (UniqueName: "kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph") pod "network-check-target-svp68" (UID: "df573a86-aad9-4aaa-9c40-5e9073ed8760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 23:26:50.054327 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:50.054301 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:50.054739 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:50.054420 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:51.053972 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:51.053938 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:51.054148 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:51.054045 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:52.053692 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:52.053674 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:52.053943 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:52.053761 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:53.054399 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.054122 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:53.055010 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:53.054461 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:53.137125 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.137085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kt5t6" event={"ID":"e24fef17-a7c3-497e-a65f-9458686c8ea2","Type":"ContainerStarted","Data":"1a6b4f6fe34406ca88057291a2f1afc83456e67e338b30f295ae482b1a07e554"} Apr 16 23:26:53.138671 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.138630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m8gnt" event={"ID":"a40691e1-4691-4b8d-b935-ff781629806d","Type":"ContainerStarted","Data":"d4b1933d9c1261392ede0f366192be4cb7bb669abd34d99388cea69fb054d83f"} Apr 16 23:26:53.140065 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.140023 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerStarted","Data":"134b6a4a6d2949f69a06e544dfe0668e4ce0f4360d5152d6e7f9b0589f23554c"} Apr 16 23:26:53.141334 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.141304 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ppndn" event={"ID":"07480989-b282-4bef-8d30-88f1027e6fb5","Type":"ContainerStarted","Data":"749f85950835795cc2adddcb5f6ce4b18873f599a56e4e83780105ab689d2e8d"} Apr 16 23:26:53.142755 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.142736 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" event={"ID":"87459b43-0232-4d86-97d1-bb0d563c1ecb","Type":"ContainerStarted","Data":"088827cc210fbf1b60b9b54356288710aa039f6872c76ebe3a25d0033c751314"} Apr 16 23:26:53.144162 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.144143 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" event={"ID":"0eb09bb6acbd755bc18c097296960fb1","Type":"ContainerStarted","Data":"03b4922f1006d08f10d524661a5faceb7abec7dbdc94954c711b498dec72ee33"} Apr 16 23:26:53.145592 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.145562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal" event={"ID":"be4647434e0fd79acaa8b46b7c163d65","Type":"ContainerStarted","Data":"9042a08c33bf646ddf95dccf04b7eab434fcf962da37f06afc7a5527d19a7562"} Apr 16 23:26:53.147253 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.147235 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-4np6h" event={"ID":"a8419133-562e-45db-abac-da560b01e6d9","Type":"ContainerStarted","Data":"7b71ed7fc7ca54e8825b7f78d549c4308a1d18bd1a12dd1353a94aa68d9999ab"} Apr 16 23:26:53.149906 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.149882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"006c2f7d17bbc5272f6dbe9e16077584e14796e762c8686e9028e9ab39209c46"} Apr 16 23:26:53.149906 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.149903 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"dac23921855413dbc26353bdd1a4d9e79fa5670508b644ba5836eac8f654d44f"} Apr 16 23:26:53.150042 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.149918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"2017e4e2986c571614c2f7ba5d1efdcc88be12455700fe0e82d4155ed489272d"} Apr 16 23:26:53.150042 ip-10-0-136-153 kubenswrapper[2573]: 
I0416 23:26:53.149929 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"68d5b0078a800ead2c4fb942e6624b63c2ee238e68d451d5b3576142af872583"} Apr 16 23:26:53.150042 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.149940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"916e97e410f0dc00616464e6f934fed085aa63f70e1549efd416222d63d7b4e6"} Apr 16 23:26:53.150042 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.149952 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"29dd10cdf698a23337a6de83171a8b459b3e7c555f9ceca6000534b23b5cf050"} Apr 16 23:26:53.151137 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.151114 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vr4xk" event={"ID":"7e54684e-7b38-446d-a750-4bb17e3d69b0","Type":"ContainerStarted","Data":"7eacbcffedd04268680d575aa272693fbf9995fa63d2fd6583425016e409436d"} Apr 16 23:26:53.151272 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.151221 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kt5t6" podStartSLOduration=2.509267076 podStartE2EDuration="20.151205481s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.324575588 +0000 UTC m=+1.926494667" lastFinishedPulling="2026-04-16 23:26:51.966513994 +0000 UTC m=+19.568433072" observedRunningTime="2026-04-16 23:26:53.150587832 +0000 UTC m=+20.752506930" watchObservedRunningTime="2026-04-16 23:26:53.151205481 +0000 UTC m=+20.753124583" Apr 16 23:26:53.161430 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:26:53.161397 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-153.ec2.internal" podStartSLOduration=20.161387924 podStartE2EDuration="20.161387924s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:26:53.160918952 +0000 UTC m=+20.762838053" watchObservedRunningTime="2026-04-16 23:26:53.161387924 +0000 UTC m=+20.763307024" Apr 16 23:26:53.185764 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.185737 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:26:53.186379 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.186359 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:26:53.195603 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.195502 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-4np6h" podStartSLOduration=2.5676650580000002 podStartE2EDuration="20.195490162s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.308349795 +0000 UTC m=+1.910268874" lastFinishedPulling="2026-04-16 23:26:51.936174899 +0000 UTC m=+19.538093978" observedRunningTime="2026-04-16 23:26:53.195147993 +0000 UTC m=+20.797067096" watchObservedRunningTime="2026-04-16 23:26:53.195490162 +0000 UTC m=+20.797409275" Apr 16 23:26:53.208575 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.208514 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m8gnt" podStartSLOduration=2.590016421 podStartE2EDuration="20.208500416s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.317672498 +0000 UTC m=+1.919591576" 
lastFinishedPulling="2026-04-16 23:26:51.936156473 +0000 UTC m=+19.538075571" observedRunningTime="2026-04-16 23:26:53.207931199 +0000 UTC m=+20.809850323" watchObservedRunningTime="2026-04-16 23:26:53.208500416 +0000 UTC m=+20.810419519" Apr 16 23:26:53.234511 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.234476 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ppndn" podStartSLOduration=2.525189881 podStartE2EDuration="20.234466969s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.255093054 +0000 UTC m=+1.857012137" lastFinishedPulling="2026-04-16 23:26:51.964370127 +0000 UTC m=+19.566289225" observedRunningTime="2026-04-16 23:26:53.234247898 +0000 UTC m=+20.836167000" watchObservedRunningTime="2026-04-16 23:26:53.234466969 +0000 UTC m=+20.836386070" Apr 16 23:26:53.248885 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:53.248855 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vr4xk" podStartSLOduration=2.5159163380000003 podStartE2EDuration="20.248845343s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.27177758 +0000 UTC m=+1.873696662" lastFinishedPulling="2026-04-16 23:26:52.004706584 +0000 UTC m=+19.606625667" observedRunningTime="2026-04-16 23:26:53.248569898 +0000 UTC m=+20.850489000" watchObservedRunningTime="2026-04-16 23:26:53.248845343 +0000 UTC m=+20.850764444" Apr 16 23:26:54.012381 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.012350 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 23:26:54.054437 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.054408 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:54.054863 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:54.054552 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:54.154683 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.154619 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-869hf" event={"ID":"b39de8b0-dcb5-4dc5-81e3-2a24820be395","Type":"ContainerStarted","Data":"290abf1e415df9e2fe7ec0794e321dca0c353b4b0a061d318dad984216079749"} Apr 16 23:26:54.155945 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.155900 2573 generic.go:358] "Generic (PLEG): container finished" podID="5cb2ede9-d7bb-4d1f-9aca-83f7715b5495" containerID="134b6a4a6d2949f69a06e544dfe0668e4ce0f4360d5152d6e7f9b0589f23554c" exitCode=0 Apr 16 23:26:54.156061 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.155970 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerDied","Data":"134b6a4a6d2949f69a06e544dfe0668e4ce0f4360d5152d6e7f9b0589f23554c"} Apr 16 23:26:54.157827 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.157802 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" event={"ID":"87459b43-0232-4d86-97d1-bb0d563c1ecb","Type":"ContainerStarted","Data":"33c7822b5fdec7ea802af6582927aae626771f7f0cd3aa7a93305f839a78bfbc"} Apr 16 23:26:54.159384 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.159362 2573 generic.go:358] "Generic (PLEG): 
container finished" podID="0eb09bb6acbd755bc18c097296960fb1" containerID="03b4922f1006d08f10d524661a5faceb7abec7dbdc94954c711b498dec72ee33" exitCode=0 Apr 16 23:26:54.159527 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.159473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" event={"ID":"0eb09bb6acbd755bc18c097296960fb1","Type":"ContainerDied","Data":"03b4922f1006d08f10d524661a5faceb7abec7dbdc94954c711b498dec72ee33"} Apr 16 23:26:54.159527 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.159513 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" event={"ID":"0eb09bb6acbd755bc18c097296960fb1","Type":"ContainerStarted","Data":"90341507354af2faaeebf11d1023936913a2edcdcd8766e0847c374f344a1486"} Apr 16 23:26:54.168656 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.168619 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-869hf" podStartSLOduration=3.53466324 podStartE2EDuration="21.168605795s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.332804193 +0000 UTC m=+1.934723272" lastFinishedPulling="2026-04-16 23:26:51.966746731 +0000 UTC m=+19.568665827" observedRunningTime="2026-04-16 23:26:54.168001363 +0000 UTC m=+21.769920466" watchObservedRunningTime="2026-04-16 23:26:54.168605795 +0000 UTC m=+21.770524896" Apr 16 23:26:54.983351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.983071 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T23:26:54.012368722Z","UUID":"67679ff1-1fcb-4ae8-ab68-4f308812c15c","Handler":null,"Name":"","Endpoint":""} Apr 16 23:26:54.984754 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.984733 2573 
csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 23:26:54.984880 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:54.984759 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 23:26:55.053754 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:55.053728 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:55.053878 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:55.053851 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:55.163104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:55.163041 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" event={"ID":"87459b43-0232-4d86-97d1-bb0d563c1ecb","Type":"ContainerStarted","Data":"bb21b007aa934567a36eaf1ab20b66eec65adc9dab2fa5587875ffb7ce41e349"} Apr 16 23:26:55.165607 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:55.165579 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"1db61abda95cbc9ebd2110a067ed83f33068c77730b6e3096c3f3df33a02f307"} Apr 16 23:26:55.165717 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:55.165641 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:26:55.178223 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:55.178183 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wxfnl" podStartSLOduration=1.520404504 podStartE2EDuration="22.178174177s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.235420596 +0000 UTC m=+1.837339675" lastFinishedPulling="2026-04-16 23:26:54.893190268 +0000 UTC m=+22.495109348" observedRunningTime="2026-04-16 23:26:55.178067038 +0000 UTC m=+22.779986136" watchObservedRunningTime="2026-04-16 23:26:55.178174177 +0000 UTC m=+22.780093314" Apr 16 23:26:55.178348 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:55.178313 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-153.ec2.internal" podStartSLOduration=22.17831043 podStartE2EDuration="22.17831043s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:26:54.201250724 +0000 UTC m=+21.803169824" watchObservedRunningTime="2026-04-16 23:26:55.17831043 +0000 UTC m=+22.780229531" Apr 16 23:26:56.054661 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:56.054626 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:56.054830 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:56.054731 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:57.054721 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:57.054697 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:57.055307 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:57.054815 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:58.053988 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:58.053949 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:26:58.054155 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:58.054065 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:26:59.053790 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.053607 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:26:59.054175 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:26:59.053833 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:26:59.173990 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.173964 2573 generic.go:358] "Generic (PLEG): container finished" podID="5cb2ede9-d7bb-4d1f-9aca-83f7715b5495" containerID="f2278faa4f5cc81fb7affef642883206f214ad85dd4ee548a3f8ab78802f26fc" exitCode=0 Apr 16 23:26:59.174091 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.174049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerDied","Data":"f2278faa4f5cc81fb7affef642883206f214ad85dd4ee548a3f8ab78802f26fc"} Apr 16 23:26:59.177589 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.177560 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" event={"ID":"9c864774-d6f1-4d07-8798-126023861e55","Type":"ContainerStarted","Data":"4b38dc486a05c74991d57f6d22c5ce08dcb7218beaf2db34951adfc0c1c6f063"} Apr 16 23:26:59.177890 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.177867 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:59.177993 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.177901 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:59.181898 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.177923 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:59.193015 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.192993 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:59.193203 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.193188 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" Apr 16 23:26:59.215516 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:26:59.215483 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57" podStartSLOduration=7.972672237 podStartE2EDuration="26.215471699s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.284881633 +0000 UTC m=+1.886800712" lastFinishedPulling="2026-04-16 23:26:52.52768109 +0000 UTC m=+20.129600174" observedRunningTime="2026-04-16 23:26:59.215232163 +0000 UTC m=+26.817151258" watchObservedRunningTime="2026-04-16 23:26:59.215471699 +0000 UTC m=+26.817390800" Apr 16 23:27:00.054547 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:00.054500 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:27:00.055090 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:00.054660 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:27:00.102031 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:00.102002 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:27:00.102190 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:00.102118 2573 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 23:27:00.102587 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:00.102569 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-4np6h" Apr 16 23:27:00.741716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:00.741559 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-svp68"] Apr 16 23:27:00.741840 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:00.741817 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:27:00.741941 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:00.741917 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:27:00.744212 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:00.744192 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qhz5v"] Apr 16 23:27:00.744316 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:00.744279 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:27:00.744399 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:00.744380 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:27:01.182187 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:01.182161 2573 generic.go:358] "Generic (PLEG): container finished" podID="5cb2ede9-d7bb-4d1f-9aca-83f7715b5495" containerID="027b3aca23f0da14c8d9956b938d8f3f7c79558b63041ba6d3b519f3c3399611" exitCode=0 Apr 16 23:27:01.182498 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:01.182245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerDied","Data":"027b3aca23f0da14c8d9956b938d8f3f7c79558b63041ba6d3b519f3c3399611"} Apr 16 23:27:02.054094 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:02.054068 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:27:02.054251 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:02.054177 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:27:02.185967 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:02.185938 2573 generic.go:358] "Generic (PLEG): container finished" podID="5cb2ede9-d7bb-4d1f-9aca-83f7715b5495" containerID="39d9a41fa698f34c17060e7f4a891067a2b11f74189e3eb5b6b2dd8382f13715" exitCode=0 Apr 16 23:27:02.186438 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:02.185991 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerDied","Data":"39d9a41fa698f34c17060e7f4a891067a2b11f74189e3eb5b6b2dd8382f13715"} Apr 16 23:27:03.055029 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:03.054996 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:27:03.055208 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:03.055108 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:27:04.053832 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:04.053805 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:27:04.054366 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:04.053916 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-svp68" podUID="df573a86-aad9-4aaa-9c40-5e9073ed8760" Apr 16 23:27:05.053633 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.053604 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:27:05.053749 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.053724 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qhz5v" podUID="3ed66159-86cf-4f43-824b-3905a5019c1c" Apr 16 23:27:05.234126 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.234054 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-153.ec2.internal" event="NodeReady" Apr 16 23:27:05.234593 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.234185 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 23:27:05.274733 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.274701 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2cxtr"] Apr 16 23:27:05.276423 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.276406 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2cxtr" Apr 16 23:27:05.279130 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.278928 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 23:27:05.279130 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.279069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 23:27:05.279372 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.279354 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-89jck\"" Apr 16 23:27:05.280145 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.280121 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fdk7d"] Apr 16 23:27:05.281932 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.281912 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fdk7d" Apr 16 23:27:05.284091 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.283931 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 23:27:05.284091 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.283997 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 23:27:05.284091 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.284002 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m6886\"" Apr 16 23:27:05.284091 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.284051 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 23:27:05.287907 ip-10-0-136-153 kubenswrapper[2573]: 
I0416 23:27:05.287885 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2cxtr"] Apr 16 23:27:05.290020 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.290001 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fdk7d"] Apr 16 23:27:05.366257 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.366232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbf7e722-63c3-4ad8-b126-d39966fa38f3-tmp-dir\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:27:05.366357 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.366267 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d" Apr 16 23:27:05.366357 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.366292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbf7e722-63c3-4ad8-b126-d39966fa38f3-config-volume\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:27:05.366357 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.366307 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:27:05.366357 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.366325 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b8ph\" (UniqueName: \"kubernetes.io/projected/fbf7e722-63c3-4ad8-b126-d39966fa38f3-kube-api-access-7b8ph\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:27:05.366357 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.366340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xzfr\" (UniqueName: \"kubernetes.io/projected/9ee4f424-f877-4116-acf8-e4a8c70b5329-kube-api-access-7xzfr\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d" Apr 16 23:27:05.466675 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.466650 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbf7e722-63c3-4ad8-b126-d39966fa38f3-config-volume\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:27:05.466675 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.466678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:27:05.466858 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.466700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b8ph\" (UniqueName: \"kubernetes.io/projected/fbf7e722-63c3-4ad8-b126-d39966fa38f3-kube-api-access-7b8ph\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:27:05.466858 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:27:05.466719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xzfr\" (UniqueName: \"kubernetes.io/projected/9ee4f424-f877-4116-acf8-e4a8c70b5329-kube-api-access-7xzfr\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:05.466858 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.466786 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbf7e722-63c3-4ad8-b126-d39966fa38f3-tmp-dir\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:05.466858 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.466826 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:05.466858 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.466849 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:27:05.467085 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.466907 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls podName:fbf7e722-63c3-4ad8-b126-d39966fa38f3 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:05.966889965 +0000 UTC m=+33.568809061 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls") pod "dns-default-2cxtr" (UID: "fbf7e722-63c3-4ad8-b126-d39966fa38f3") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:05.467085 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.466984 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:27:05.467085 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.467024 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert podName:9ee4f424-f877-4116-acf8-e4a8c70b5329 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:05.96701117 +0000 UTC m=+33.568930264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert") pod "ingress-canary-fdk7d" (UID: "9ee4f424-f877-4116-acf8-e4a8c70b5329") : secret "canary-serving-cert" not found
Apr 16 23:27:05.467255 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.467125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fbf7e722-63c3-4ad8-b126-d39966fa38f3-tmp-dir\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:05.467255 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.467206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fbf7e722-63c3-4ad8-b126-d39966fa38f3-config-volume\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:05.476775 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.476753 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b8ph\" (UniqueName: \"kubernetes.io/projected/fbf7e722-63c3-4ad8-b126-d39966fa38f3-kube-api-access-7b8ph\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:05.476930 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.476913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xzfr\" (UniqueName: \"kubernetes.io/projected/9ee4f424-f877-4116-acf8-e4a8c70b5329-kube-api-access-7xzfr\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:05.667751 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.667678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v"
Apr 16 23:27:05.667900 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.667836 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:27:05.667956 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.667906 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs podName:3ed66159-86cf-4f43-824b-3905a5019c1c nodeName:}" failed. No retries permitted until 2026-04-16 23:27:37.667891813 +0000 UTC m=+65.269810896 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs") pod "network-metrics-daemon-qhz5v" (UID: "3ed66159-86cf-4f43-824b-3905a5019c1c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 23:27:05.868829 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.868788 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68"
Apr 16 23:27:05.868988 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.868923 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 23:27:05.868988 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.868946 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 23:27:05.868988 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.868958 2573 projected.go:194] Error preparing data for projected volume kube-api-access-57mph for pod openshift-network-diagnostics/network-check-target-svp68: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:27:05.869157 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.869011 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph podName:df573a86-aad9-4aaa-9c40-5e9073ed8760 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:37.868993388 +0000 UTC m=+65.470912471 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-57mph" (UniqueName: "kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph") pod "network-check-target-svp68" (UID: "df573a86-aad9-4aaa-9c40-5e9073ed8760") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 23:27:05.969546 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.969499 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:05.969715 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:05.969569 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:05.969715 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.969671 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:27:05.969831 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.969731 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:27:05.969831 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.969735 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert podName:9ee4f424-f877-4116-acf8-e4a8c70b5329 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:06.969719478 +0000 UTC m=+34.571638563 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert") pod "ingress-canary-fdk7d" (UID: "9ee4f424-f877-4116-acf8-e4a8c70b5329") : secret "canary-serving-cert" not found
Apr 16 23:27:05.969831 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:05.969803 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls podName:fbf7e722-63c3-4ad8-b126-d39966fa38f3 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:06.969784321 +0000 UTC m=+34.571703410 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls") pod "dns-default-2cxtr" (UID: "fbf7e722-63c3-4ad8-b126-d39966fa38f3") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:06.053962 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:06.053931 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68"
Apr 16 23:27:06.056744 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:06.056719 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 23:27:06.056878 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:06.056787 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 23:27:06.056878 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:06.056841 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-grjf2\""
Apr 16 23:27:06.975919 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:06.975882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:06.976573 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:06.975938 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:06.976573 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:06.976046 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:27:06.976573 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:06.976089 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:27:06.976573 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:06.976128 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert podName:9ee4f424-f877-4116-acf8-e4a8c70b5329 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:08.976105041 +0000 UTC m=+36.578024121 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert") pod "ingress-canary-fdk7d" (UID: "9ee4f424-f877-4116-acf8-e4a8c70b5329") : secret "canary-serving-cert" not found
Apr 16 23:27:06.976573 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:06.976150 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls podName:fbf7e722-63c3-4ad8-b126-d39966fa38f3 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:08.97613885 +0000 UTC m=+36.578057929 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls") pod "dns-default-2cxtr" (UID: "fbf7e722-63c3-4ad8-b126-d39966fa38f3") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:07.054301 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:07.054267 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v"
Apr 16 23:27:07.058728 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:07.056822 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 23:27:07.058728 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:07.056837 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wl6xg\""
Apr 16 23:27:08.199008 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:08.198976 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerStarted","Data":"64697ab1bbe9eac000c00ef0d26c0c36595fd5d4c94a858812a8fefd7bb1df5b"}
Apr 16 23:27:08.990665 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:08.990473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:08.990798 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:08.990680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:08.990798 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:08.990614 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:27:08.990798 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:08.990781 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert podName:9ee4f424-f877-4116-acf8-e4a8c70b5329 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:12.990766579 +0000 UTC m=+40.592685657 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert") pod "ingress-canary-fdk7d" (UID: "9ee4f424-f877-4116-acf8-e4a8c70b5329") : secret "canary-serving-cert" not found
Apr 16 23:27:08.990798 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:08.990786 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:27:08.990945 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:08.990823 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls podName:fbf7e722-63c3-4ad8-b126-d39966fa38f3 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:12.990811861 +0000 UTC m=+40.592730941 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls") pod "dns-default-2cxtr" (UID: "fbf7e722-63c3-4ad8-b126-d39966fa38f3") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:09.202728 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:09.202700 2573 generic.go:358] "Generic (PLEG): container finished" podID="5cb2ede9-d7bb-4d1f-9aca-83f7715b5495" containerID="64697ab1bbe9eac000c00ef0d26c0c36595fd5d4c94a858812a8fefd7bb1df5b" exitCode=0
Apr 16 23:27:09.203025 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:09.202753 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerDied","Data":"64697ab1bbe9eac000c00ef0d26c0c36595fd5d4c94a858812a8fefd7bb1df5b"}
Apr 16 23:27:10.207151 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:10.207120 2573 generic.go:358] "Generic (PLEG): container finished" podID="5cb2ede9-d7bb-4d1f-9aca-83f7715b5495" containerID="3e2a9f46235b65a8de98672975b1b04d291c0f9af4ee3ad0c63fa83677685274" exitCode=0
Apr 16 23:27:10.207495 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:10.207164 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerDied","Data":"3e2a9f46235b65a8de98672975b1b04d291c0f9af4ee3ad0c63fa83677685274"}
Apr 16 23:27:11.211474 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:11.211437 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" event={"ID":"5cb2ede9-d7bb-4d1f-9aca-83f7715b5495","Type":"ContainerStarted","Data":"733b0b0bb78af0a1f9b516dd3b2575cf437e7e56803e01ba9faed200657664be"}
Apr 16 23:27:11.230649 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:11.230609 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wz8rl" podStartSLOduration=4.479077242 podStartE2EDuration="38.230595248s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:26:34.259838999 +0000 UTC m=+1.861758078" lastFinishedPulling="2026-04-16 23:27:08.011357002 +0000 UTC m=+35.613276084" observedRunningTime="2026-04-16 23:27:11.230504255 +0000 UTC m=+38.832423357" watchObservedRunningTime="2026-04-16 23:27:11.230595248 +0000 UTC m=+38.832514352"
Apr 16 23:27:13.016969 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:13.016939 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:13.016969 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:13.016973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:13.017406 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:13.017077 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:27:13.017406 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:13.017090 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:27:13.017406 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:13.017122 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls podName:fbf7e722-63c3-4ad8-b126-d39966fa38f3 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:21.017109776 +0000 UTC m=+48.619028855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls") pod "dns-default-2cxtr" (UID: "fbf7e722-63c3-4ad8-b126-d39966fa38f3") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:13.017406 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:13.017152 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert podName:9ee4f424-f877-4116-acf8-e4a8c70b5329 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:21.017134221 +0000 UTC m=+48.619053304 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert") pod "ingress-canary-fdk7d" (UID: "9ee4f424-f877-4116-acf8-e4a8c70b5329") : secret "canary-serving-cert" not found
Apr 16 23:27:21.071038 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:21.071006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:21.071038 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:21.071040 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:21.071598 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:21.071138 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:27:21.071598 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:21.071142 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:27:21.071598 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:21.071189 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls podName:fbf7e722-63c3-4ad8-b126-d39966fa38f3 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:37.071174939 +0000 UTC m=+64.673094017 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls") pod "dns-default-2cxtr" (UID: "fbf7e722-63c3-4ad8-b126-d39966fa38f3") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:21.071598 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:21.071202 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert podName:9ee4f424-f877-4116-acf8-e4a8c70b5329 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:37.071196836 +0000 UTC m=+64.673115914 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert") pod "ingress-canary-fdk7d" (UID: "9ee4f424-f877-4116-acf8-e4a8c70b5329") : secret "canary-serving-cert" not found
Apr 16 23:27:31.193054 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:31.193028 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pn57"
Apr 16 23:27:37.074631 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:37.074596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d"
Apr 16 23:27:37.074631 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:37.074632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr"
Apr 16 23:27:37.075119 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:37.074732 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 23:27:37.075119 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:37.074739 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 23:27:37.075119 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:37.074786 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert podName:9ee4f424-f877-4116-acf8-e4a8c70b5329 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:09.074772497 +0000 UTC m=+96.676691577 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert") pod "ingress-canary-fdk7d" (UID: "9ee4f424-f877-4116-acf8-e4a8c70b5329") : secret "canary-serving-cert" not found
Apr 16 23:27:37.075119 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:37.074801 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls podName:fbf7e722-63c3-4ad8-b126-d39966fa38f3 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:09.074795371 +0000 UTC m=+96.676714449 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls") pod "dns-default-2cxtr" (UID: "fbf7e722-63c3-4ad8-b126-d39966fa38f3") : secret "dns-default-metrics-tls" not found
Apr 16 23:27:37.679077 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:37.679046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v"
Apr 16 23:27:37.681599 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:37.681577 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 23:27:37.689371 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:37.689352 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 23:27:37.689426 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:37.689406 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs podName:3ed66159-86cf-4f43-824b-3905a5019c1c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:41.689390951 +0000 UTC m=+129.291310031 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs") pod "network-metrics-daemon-qhz5v" (UID: "3ed66159-86cf-4f43-824b-3905a5019c1c") : secret "metrics-daemon-secret" not found
Apr 16 23:27:37.880634 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:37.880598 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68"
Apr 16 23:27:37.883169 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:37.883151 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 23:27:37.893472 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:37.893454 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 23:27:37.905487 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:37.905463 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-57mph\" (UniqueName: \"kubernetes.io/projected/df573a86-aad9-4aaa-9c40-5e9073ed8760-kube-api-access-57mph\") pod \"network-check-target-svp68\" (UID: \"df573a86-aad9-4aaa-9c40-5e9073ed8760\") " pod="openshift-network-diagnostics/network-check-target-svp68"
Apr 16 23:27:38.166681 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:38.166629 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-grjf2\""
Apr 16 23:27:38.175245 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:38.175226 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-svp68"
Apr 16 23:27:38.294521 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:38.294486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-svp68"]
Apr 16 23:27:38.298697 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:27:38.298669 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf573a86_aad9_4aaa_9c40_5e9073ed8760.slice/crio-01e1668e1d1bc3b80dfb2629ca25f142352412cd45455349131eac583edb6c02 WatchSource:0}: Error finding container 01e1668e1d1bc3b80dfb2629ca25f142352412cd45455349131eac583edb6c02: Status 404 returned error can't find the container with id 01e1668e1d1bc3b80dfb2629ca25f142352412cd45455349131eac583edb6c02
Apr 16 23:27:39.266147 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:39.266113 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-svp68" event={"ID":"df573a86-aad9-4aaa-9c40-5e9073ed8760","Type":"ContainerStarted","Data":"01e1668e1d1bc3b80dfb2629ca25f142352412cd45455349131eac583edb6c02"}
Apr 16 23:27:42.274467 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:42.274358 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-svp68" event={"ID":"df573a86-aad9-4aaa-9c40-5e9073ed8760","Type":"ContainerStarted","Data":"643e2cc9f5d8da79686229f454c4003a9b31f1865934d4390a56e97a4fa0cdc5"}
Apr 16 23:27:42.274834 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:42.274492 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-svp68"
Apr 16 23:27:42.287962 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:42.287907 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-svp68" podStartSLOduration=66.393430303 podStartE2EDuration="1m9.287891503s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:27:38.300267191 +0000 UTC m=+65.902186271" lastFinishedPulling="2026-04-16 23:27:41.194728392 +0000 UTC m=+68.796647471" observedRunningTime="2026-04-16 23:27:42.287564066 +0000 UTC m=+69.889483167" watchObservedRunningTime="2026-04-16 23:27:42.287891503 +0000 UTC m=+69.889810604"
Apr 16 23:27:58.986105 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:58.986069 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5"]
Apr 16 23:27:58.990346 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:58.990329 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5"
Apr 16 23:27:58.993664 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:58.993633 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 23:27:58.993800 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:58.993683 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 23:27:58.994549 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:58.994516 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 23:27:58.994661 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:58.994626 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 23:27:58.994661 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:58.994629 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-qkmhq\""
Apr 16 23:27:58.998032 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:58.998011 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5"]
Apr 16 23:27:59.017187 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.017160 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/84d158a6-dfbe-407b-be72-b82ba7380fd8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5"
Apr 16 23:27:59.017283 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.017202 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5"
Apr 16 23:27:59.017283 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.017247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwxnz\" (UniqueName: \"kubernetes.io/projected/84d158a6-dfbe-407b-be72-b82ba7380fd8-kube-api-access-rwxnz\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5"
Apr 16 23:27:59.087827 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.087804 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5897bbd496-f9ftq"]
Apr 16 23:27:59.090644 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.090629 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5897bbd496-f9ftq"
Apr 16 23:27:59.092926 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.092906 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 23:27:59.093160 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.093137 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-cbs68\""
Apr 16 23:27:59.093160 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.093153 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 23:27:59.093339 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.093161 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 23:27:59.093339 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.093164 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 23:27:59.093339 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.093301 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 23:27:59.093490 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.093356 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 23:27:59.098774 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.098753 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5897bbd496-f9ftq"]
Apr 16 23:27:59.118389 ip-10-0-136-153 kubenswrapper[2573]:
I0416 23:27:59.118365 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/84d158a6-dfbe-407b-be72-b82ba7380fd8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:27:59.118477 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.118396 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-default-certificate\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.118477 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.118426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:27:59.118477 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.118451 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-stats-auth\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.118477 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.118475 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.118639 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.118563 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 23:27:59.118639 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.118562 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwxnz\" (UniqueName: \"kubernetes.io/projected/84d158a6-dfbe-407b-be72-b82ba7380fd8-kube-api-access-rwxnz\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:27:59.118639 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.118625 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls podName:84d158a6-dfbe-407b-be72-b82ba7380fd8 nodeName:}" failed. No retries permitted until 2026-04-16 23:27:59.618609948 +0000 UTC m=+87.220529027 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w9vw5" (UID: "84d158a6-dfbe-407b-be72-b82ba7380fd8") : secret "cluster-monitoring-operator-tls" not found Apr 16 23:27:59.118746 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.118654 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.118746 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.118675 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b49jg\" (UniqueName: \"kubernetes.io/projected/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-kube-api-access-b49jg\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.119068 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.119050 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/84d158a6-dfbe-407b-be72-b82ba7380fd8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:27:59.126328 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.126305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwxnz\" (UniqueName: \"kubernetes.io/projected/84d158a6-dfbe-407b-be72-b82ba7380fd8-kube-api-access-rwxnz\") pod 
\"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:27:59.219547 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.219513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.219647 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.219623 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 23:27:59.219647 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.219633 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b49jg\" (UniqueName: \"kubernetes.io/projected/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-kube-api-access-b49jg\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.219729 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.219680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-default-certificate\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.219783 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.219749 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. 
No retries permitted until 2026-04-16 23:27:59.719721091 +0000 UTC m=+87.321640184 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : secret "router-metrics-certs-default" not found Apr 16 23:27:59.219844 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.219782 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-stats-auth\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.219844 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.219821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.219992 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.219979 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:27:59.719967362 +0000 UTC m=+87.321886442 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : configmap references non-existent config key: service-ca.crt Apr 16 23:27:59.222393 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.222371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-stats-auth\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.222468 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.222407 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-default-certificate\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.229857 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.229838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b49jg\" (UniqueName: \"kubernetes.io/projected/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-kube-api-access-b49jg\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.622192 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.622168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:27:59.622295 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.622243 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 23:27:59.622295 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.622286 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls podName:84d158a6-dfbe-407b-be72-b82ba7380fd8 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:00.622274382 +0000 UTC m=+88.224193461 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w9vw5" (UID: "84d158a6-dfbe-407b-be72-b82ba7380fd8") : secret "cluster-monitoring-operator-tls" not found Apr 16 23:27:59.723188 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.723163 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.723273 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:27:59.723253 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:27:59.723316 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.723293 2573 secret.go:189] Couldn't 
get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 23:27:59.723371 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.723360 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:00.723343411 +0000 UTC m=+88.325262497 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : secret "router-metrics-certs-default" not found Apr 16 23:27:59.723412 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:27:59.723376 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:00.723369857 +0000 UTC m=+88.325288935 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : configmap references non-existent config key: service-ca.crt Apr 16 23:28:00.630905 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:00.630863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:28:00.631309 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:00.631026 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 23:28:00.631309 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:00.631099 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls podName:84d158a6-dfbe-407b-be72-b82ba7380fd8 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:02.631080965 +0000 UTC m=+90.233000044 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w9vw5" (UID: "84d158a6-dfbe-407b-be72-b82ba7380fd8") : secret "cluster-monitoring-operator-tls" not found Apr 16 23:28:00.732227 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:00.732198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:00.732359 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:00.732241 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:00.732359 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:00.732334 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 23:28:00.732431 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:00.732359 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:02.732342361 +0000 UTC m=+90.334261441 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : configmap references non-existent config key: service-ca.crt Apr 16 23:28:00.732431 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:00.732382 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:02.732374972 +0000 UTC m=+90.334294051 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : secret "router-metrics-certs-default" not found Apr 16 23:28:02.646444 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:02.646407 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:28:02.646852 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:02.646525 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 23:28:02.646852 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:02.646606 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls podName:84d158a6-dfbe-407b-be72-b82ba7380fd8 nodeName:}" failed. 
No retries permitted until 2026-04-16 23:28:06.64659188 +0000 UTC m=+94.248510960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w9vw5" (UID: "84d158a6-dfbe-407b-be72-b82ba7380fd8") : secret "cluster-monitoring-operator-tls" not found Apr 16 23:28:02.747124 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:02.747083 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:02.747124 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:02.747136 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:02.747302 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:02.747221 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 23:28:02.747302 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:02.747258 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:06.747239749 +0000 UTC m=+94.349158841 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : configmap references non-existent config key: service-ca.crt Apr 16 23:28:02.747302 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:02.747287 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:06.747279316 +0000 UTC m=+94.349198394 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : secret "router-metrics-certs-default" not found Apr 16 23:28:04.516884 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:04.516854 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m8gnt_a40691e1-4691-4b8d-b935-ff781629806d/dns-node-resolver/0.log" Apr 16 23:28:05.317003 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:05.316971 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kt5t6_e24fef17-a7c3-497e-a65f-9458686c8ea2/node-ca/0.log" Apr 16 23:28:06.674075 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:06.674026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:28:06.674507 ip-10-0-136-153 kubenswrapper[2573]: 
E0416 23:28:06.674159 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 23:28:06.674507 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:06.674220 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls podName:84d158a6-dfbe-407b-be72-b82ba7380fd8 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:14.674204483 +0000 UTC m=+102.276123563 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w9vw5" (UID: "84d158a6-dfbe-407b-be72-b82ba7380fd8") : secret "cluster-monitoring-operator-tls" not found Apr 16 23:28:06.774838 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:06.774808 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:06.774929 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:06.774908 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:06.774982 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:06.774961 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 23:28:06.775034 
ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:06.775022 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:14.775006556 +0000 UTC m=+102.376925641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : secret "router-metrics-certs-default" not found Apr 16 23:28:06.775074 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:06.775041 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:14.775035024 +0000 UTC m=+102.376954103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : configmap references non-existent config key: service-ca.crt Apr 16 23:28:07.860146 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.860112 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm"] Apr 16 23:28:07.863171 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.863149 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv"] Apr 16 23:28:07.863323 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.863302 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:07.865639 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.865609 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 23:28:07.865639 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.865625 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-84r5q\"" Apr 16 23:28:07.865812 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.865646 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 23:28:07.865812 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.865652 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:28:07.866064 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.866049 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv" Apr 16 23:28:07.867878 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.867858 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-8w6zx\"" Apr 16 23:28:07.868000 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.867988 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 23:28:07.868099 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.868079 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:28:07.873286 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.873250 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm"] Apr 16 23:28:07.885946 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.885927 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv"] Apr 16 23:28:07.981840 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.981817 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:07.981935 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.981860 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqlrd\" (UniqueName: 
\"kubernetes.io/projected/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-kube-api-access-vqlrd\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:07.981980 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:07.981947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gq98\" (UniqueName: \"kubernetes.io/projected/c00f22ca-f714-4922-9bff-d7d78acd7194-kube-api-access-6gq98\") pod \"volume-data-source-validator-7c6cbb6c87-tjplv\" (UID: \"c00f22ca-f714-4922-9bff-d7d78acd7194\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv" Apr 16 23:28:08.082183 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.082159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:08.082279 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.082196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqlrd\" (UniqueName: \"kubernetes.io/projected/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-kube-api-access-vqlrd\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:08.082279 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.082242 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gq98\" (UniqueName: 
\"kubernetes.io/projected/c00f22ca-f714-4922-9bff-d7d78acd7194-kube-api-access-6gq98\") pod \"volume-data-source-validator-7c6cbb6c87-tjplv\" (UID: \"c00f22ca-f714-4922-9bff-d7d78acd7194\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv" Apr 16 23:28:08.082410 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:08.082303 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:28:08.082410 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:08.082369 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls podName:ad3e49ef-6cc9-41e0-8311-05eb71f392d5 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:08.582349428 +0000 UTC m=+96.184268513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-s8sqm" (UID: "ad3e49ef-6cc9-41e0-8311-05eb71f392d5") : secret "samples-operator-tls" not found Apr 16 23:28:08.090631 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.090603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqlrd\" (UniqueName: \"kubernetes.io/projected/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-kube-api-access-vqlrd\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:08.090716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.090677 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gq98\" (UniqueName: \"kubernetes.io/projected/c00f22ca-f714-4922-9bff-d7d78acd7194-kube-api-access-6gq98\") pod \"volume-data-source-validator-7c6cbb6c87-tjplv\" 
(UID: \"c00f22ca-f714-4922-9bff-d7d78acd7194\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv" Apr 16 23:28:08.179831 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.179812 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv" Apr 16 23:28:08.287559 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.287517 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv"] Apr 16 23:28:08.291606 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:08.291582 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc00f22ca_f714_4922_9bff_d7d78acd7194.slice/crio-91bc7bf1711c559222551ac10b33e98a85176e76470ed884d910c2221f850569 WatchSource:0}: Error finding container 91bc7bf1711c559222551ac10b33e98a85176e76470ed884d910c2221f850569: Status 404 returned error can't find the container with id 91bc7bf1711c559222551ac10b33e98a85176e76470ed884d910c2221f850569 Apr 16 23:28:08.324348 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.324323 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv" event={"ID":"c00f22ca-f714-4922-9bff-d7d78acd7194","Type":"ContainerStarted","Data":"91bc7bf1711c559222551ac10b33e98a85176e76470ed884d910c2221f850569"} Apr 16 23:28:08.586521 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.586463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:08.586641 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:08.586623 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:28:08.586694 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:08.586684 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls podName:ad3e49ef-6cc9-41e0-8311-05eb71f392d5 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:09.586668789 +0000 UTC m=+97.188587868 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-s8sqm" (UID: "ad3e49ef-6cc9-41e0-8311-05eb71f392d5") : secret "samples-operator-tls" not found Apr 16 23:28:08.966107 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.966077 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4"] Apr 16 23:28:08.970287 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.970043 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:08.972272 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.972247 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-4g2xq\"" Apr 16 23:28:08.972386 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.972289 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 23:28:08.972970 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.972951 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 23:28:08.973081 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.972987 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 23:28:08.973081 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.972951 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 23:28:08.976416 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:08.976392 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4"] Apr 16 23:28:09.090329 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.090297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65533e30-6d62-4a07-b42f-3d2ebe34b87f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.090462 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.090341 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65533e30-6d62-4a07-b42f-3d2ebe34b87f-config\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.090512 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.090480 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d" Apr 16 23:28:09.090595 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.090516 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5pb4\" (UniqueName: \"kubernetes.io/projected/65533e30-6d62-4a07-b42f-3d2ebe34b87f-kube-api-access-x5pb4\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.090595 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.090567 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:28:09.090693 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:09.090639 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 23:28:09.090693 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:09.090663 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 
23:28:09.090773 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:09.090710 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls podName:fbf7e722-63c3-4ad8-b126-d39966fa38f3 nodeName:}" failed. No retries permitted until 2026-04-16 23:29:13.090694321 +0000 UTC m=+160.692613414 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls") pod "dns-default-2cxtr" (UID: "fbf7e722-63c3-4ad8-b126-d39966fa38f3") : secret "dns-default-metrics-tls" not found Apr 16 23:28:09.090773 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:09.090727 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert podName:9ee4f424-f877-4116-acf8-e4a8c70b5329 nodeName:}" failed. No retries permitted until 2026-04-16 23:29:13.090718029 +0000 UTC m=+160.692637107 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert") pod "ingress-canary-fdk7d" (UID: "9ee4f424-f877-4116-acf8-e4a8c70b5329") : secret "canary-serving-cert" not found Apr 16 23:28:09.191335 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.191313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5pb4\" (UniqueName: \"kubernetes.io/projected/65533e30-6d62-4a07-b42f-3d2ebe34b87f-kube-api-access-x5pb4\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.191455 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.191366 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65533e30-6d62-4a07-b42f-3d2ebe34b87f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.191455 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.191390 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65533e30-6d62-4a07-b42f-3d2ebe34b87f-config\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.191835 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.191814 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65533e30-6d62-4a07-b42f-3d2ebe34b87f-config\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 
23:28:09.193373 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.193354 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65533e30-6d62-4a07-b42f-3d2ebe34b87f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.198064 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.198040 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5pb4\" (UniqueName: \"kubernetes.io/projected/65533e30-6d62-4a07-b42f-3d2ebe34b87f-kube-api-access-x5pb4\") pod \"service-ca-operator-d6fc45fc5-g6wh4\" (UID: \"65533e30-6d62-4a07-b42f-3d2ebe34b87f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.281190 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.281142 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" Apr 16 23:28:09.399987 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.399959 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4"] Apr 16 23:28:09.403752 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:09.403722 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65533e30_6d62_4a07_b42f_3d2ebe34b87f.slice/crio-38d78a4e8f9f21152691039c1e2e9f4d066f482b9ca1a0e254e540772f4fe6f4 WatchSource:0}: Error finding container 38d78a4e8f9f21152691039c1e2e9f4d066f482b9ca1a0e254e540772f4fe6f4: Status 404 returned error can't find the container with id 38d78a4e8f9f21152691039c1e2e9f4d066f482b9ca1a0e254e540772f4fe6f4 Apr 16 23:28:09.594035 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:09.593957 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:09.594177 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:09.594109 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:28:09.594221 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:09.594183 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls podName:ad3e49ef-6cc9-41e0-8311-05eb71f392d5 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:11.594166188 +0000 UTC m=+99.196085267 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-s8sqm" (UID: "ad3e49ef-6cc9-41e0-8311-05eb71f392d5") : secret "samples-operator-tls" not found Apr 16 23:28:10.329387 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:10.329354 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" event={"ID":"65533e30-6d62-4a07-b42f-3d2ebe34b87f","Type":"ContainerStarted","Data":"38d78a4e8f9f21152691039c1e2e9f4d066f482b9ca1a0e254e540772f4fe6f4"} Apr 16 23:28:10.330826 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:10.330799 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv" 
event={"ID":"c00f22ca-f714-4922-9bff-d7d78acd7194","Type":"ContainerStarted","Data":"fff9367692f8ecc1476e35fe91b6ec639970328a5eae6253d87eb780f59a2142"} Apr 16 23:28:10.344497 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:10.344361 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-tjplv" podStartSLOduration=1.883328982 podStartE2EDuration="3.34434436s" podCreationTimestamp="2026-04-16 23:28:07 +0000 UTC" firstStartedPulling="2026-04-16 23:28:08.293672012 +0000 UTC m=+95.895591092" lastFinishedPulling="2026-04-16 23:28:09.754687391 +0000 UTC m=+97.356606470" observedRunningTime="2026-04-16 23:28:10.344009402 +0000 UTC m=+97.945928505" watchObservedRunningTime="2026-04-16 23:28:10.34434436 +0000 UTC m=+97.946263461" Apr 16 23:28:11.609610 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:11.609564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:11.610019 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:11.609718 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:28:11.610019 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:11.609799 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls podName:ad3e49ef-6cc9-41e0-8311-05eb71f392d5 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:15.609777842 +0000 UTC m=+103.211696944 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-s8sqm" (UID: "ad3e49ef-6cc9-41e0-8311-05eb71f392d5") : secret "samples-operator-tls" not found Apr 16 23:28:12.335191 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:12.335161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" event={"ID":"65533e30-6d62-4a07-b42f-3d2ebe34b87f","Type":"ContainerStarted","Data":"d5b1c6156f7b746eee080729cfad0d060b3db2ccfc5af285c220535b87ffe01e"} Apr 16 23:28:12.348823 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:12.348772 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" podStartSLOduration=1.8474941870000001 podStartE2EDuration="4.34875814s" podCreationTimestamp="2026-04-16 23:28:08 +0000 UTC" firstStartedPulling="2026-04-16 23:28:09.405757305 +0000 UTC m=+97.007676385" lastFinishedPulling="2026-04-16 23:28:11.907021259 +0000 UTC m=+99.508940338" observedRunningTime="2026-04-16 23:28:12.34836221 +0000 UTC m=+99.950281320" watchObservedRunningTime="2026-04-16 23:28:12.34875814 +0000 UTC m=+99.950677240" Apr 16 23:28:13.278607 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:13.278577 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-svp68" Apr 16 23:28:14.730623 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:14.730591 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:28:14.731065 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:14.730739 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 23:28:14.731065 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:14.730828 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls podName:84d158a6-dfbe-407b-be72-b82ba7380fd8 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:30.730807068 +0000 UTC m=+118.332726152 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-w9vw5" (UID: "84d158a6-dfbe-407b-be72-b82ba7380fd8") : secret "cluster-monitoring-operator-tls" not found Apr 16 23:28:14.831851 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:14.831827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:14.831959 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:14.831863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:14.831997 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:14.831970 2573 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:30.831955664 +0000 UTC m=+118.433874743 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : configmap references non-existent config key: service-ca.crt Apr 16 23:28:14.831997 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:14.831986 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 23:28:14.832064 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:14.832031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs podName:99c15427-bc7a-4ab4-b1c3-c425e97bf63c nodeName:}" failed. No retries permitted until 2026-04-16 23:28:30.832018096 +0000 UTC m=+118.433937174 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs") pod "router-default-5897bbd496-f9ftq" (UID: "99c15427-bc7a-4ab4-b1c3-c425e97bf63c") : secret "router-metrics-certs-default" not found Apr 16 23:28:15.374796 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.374766 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2kq9p"] Apr 16 23:28:15.393942 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.393919 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2kq9p"] Apr 16 23:28:15.394073 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.393948 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.396035 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.396010 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 23:28:15.396147 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.396023 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-g7897\"" Apr 16 23:28:15.396736 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.396718 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 23:28:15.396917 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.396898 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 23:28:15.396988 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.396943 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 23:28:15.436207 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:28:15.436188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b1361a22-1f86-4464-aaff-3d27c4f9c94b-signing-cabundle\") pod \"service-ca-865cb79987-2kq9p\" (UID: \"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.436311 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.436215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8mtw\" (UniqueName: \"kubernetes.io/projected/b1361a22-1f86-4464-aaff-3d27c4f9c94b-kube-api-access-x8mtw\") pod \"service-ca-865cb79987-2kq9p\" (UID: \"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.436311 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.436288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b1361a22-1f86-4464-aaff-3d27c4f9c94b-signing-key\") pod \"service-ca-865cb79987-2kq9p\" (UID: \"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.537041 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.537018 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b1361a22-1f86-4464-aaff-3d27c4f9c94b-signing-key\") pod \"service-ca-865cb79987-2kq9p\" (UID: \"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.537174 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.537075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b1361a22-1f86-4464-aaff-3d27c4f9c94b-signing-cabundle\") pod \"service-ca-865cb79987-2kq9p\" (UID: 
\"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.537174 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.537095 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8mtw\" (UniqueName: \"kubernetes.io/projected/b1361a22-1f86-4464-aaff-3d27c4f9c94b-kube-api-access-x8mtw\") pod \"service-ca-865cb79987-2kq9p\" (UID: \"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.538207 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.538191 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/b1361a22-1f86-4464-aaff-3d27c4f9c94b-signing-cabundle\") pod \"service-ca-865cb79987-2kq9p\" (UID: \"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.539408 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.539388 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/b1361a22-1f86-4464-aaff-3d27c4f9c94b-signing-key\") pod \"service-ca-865cb79987-2kq9p\" (UID: \"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.545464 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.545444 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8mtw\" (UniqueName: \"kubernetes.io/projected/b1361a22-1f86-4464-aaff-3d27c4f9c94b-kube-api-access-x8mtw\") pod \"service-ca-865cb79987-2kq9p\" (UID: \"b1361a22-1f86-4464-aaff-3d27c4f9c94b\") " pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.637689 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.637639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:15.637786 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:15.637770 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 23:28:15.637842 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:15.637833 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls podName:ad3e49ef-6cc9-41e0-8311-05eb71f392d5 nodeName:}" failed. No retries permitted until 2026-04-16 23:28:23.637817391 +0000 UTC m=+111.239736480 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-s8sqm" (UID: "ad3e49ef-6cc9-41e0-8311-05eb71f392d5") : secret "samples-operator-tls" not found Apr 16 23:28:15.703099 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.703072 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-2kq9p" Apr 16 23:28:15.752865 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.752836 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xdd9t"] Apr 16 23:28:15.756989 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.756971 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.759018 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.758994 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 23:28:15.759167 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.759147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 23:28:15.759435 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.759410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 23:28:15.759560 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.759498 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 23:28:15.759829 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.759662 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jrpnr\"" Apr 16 23:28:15.764147 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.764124 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xdd9t"] Apr 16 23:28:15.817372 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.817346 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-2kq9p"] Apr 16 23:28:15.821244 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:15.821216 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1361a22_1f86_4464_aaff_3d27c4f9c94b.slice/crio-654c156d1f029bedf78d56c097570a65438bb1f87c0317a8d0494b1aa8e10290 WatchSource:0}: Error finding container 654c156d1f029bedf78d56c097570a65438bb1f87c0317a8d0494b1aa8e10290: Status 
404 returned error can't find the container with id 654c156d1f029bedf78d56c097570a65438bb1f87c0317a8d0494b1aa8e10290 Apr 16 23:28:15.840076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.840058 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.840163 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.840092 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjjfm\" (UniqueName: \"kubernetes.io/projected/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-kube-api-access-zjjfm\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.840220 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.840170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.840303 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.840248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-data-volume\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.840339 ip-10-0-136-153 kubenswrapper[2573]: 
I0416 23:28:15.840300 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-crio-socket\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.940922 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.940894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.940922 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.940925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjjfm\" (UniqueName: \"kubernetes.io/projected/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-kube-api-access-zjjfm\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.941098 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.941066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.941158 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.941143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-data-volume\") pod 
\"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.941210 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.941163 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-crio-socket\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.941210 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:15.941191 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:15.941306 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:15.941247 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls podName:6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b nodeName:}" failed. No retries permitted until 2026-04-16 23:28:16.441228383 +0000 UTC m=+104.043147469 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xdd9t" (UID: "6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:15.941306 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.941289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-crio-socket\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.941491 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.941430 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-data-volume\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.941624 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.941504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:15.949144 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:15.949124 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjjfm\" (UniqueName: \"kubernetes.io/projected/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-kube-api-access-zjjfm\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 
16 23:28:16.347036 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:16.347003 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2kq9p" event={"ID":"b1361a22-1f86-4464-aaff-3d27c4f9c94b","Type":"ContainerStarted","Data":"4f9a7bdde57077a8182ae0054613a1ef01057662a4d56817c3b00508888eec2f"} Apr 16 23:28:16.347036 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:16.347039 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-2kq9p" event={"ID":"b1361a22-1f86-4464-aaff-3d27c4f9c94b","Type":"ContainerStarted","Data":"654c156d1f029bedf78d56c097570a65438bb1f87c0317a8d0494b1aa8e10290"} Apr 16 23:28:16.363944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:16.363898 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-2kq9p" podStartSLOduration=1.363882882 podStartE2EDuration="1.363882882s" podCreationTimestamp="2026-04-16 23:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:28:16.362718729 +0000 UTC m=+103.964637830" watchObservedRunningTime="2026-04-16 23:28:16.363882882 +0000 UTC m=+103.965801985" Apr 16 23:28:16.444255 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:16.444230 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:16.444396 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:16.444378 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:16.444456 ip-10-0-136-153 
kubenswrapper[2573]: E0416 23:28:16.444446 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls podName:6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b nodeName:}" failed. No retries permitted until 2026-04-16 23:28:17.444429076 +0000 UTC m=+105.046348162 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xdd9t" (UID: "6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:17.452302 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:17.452257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:17.452972 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:17.452411 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:17.452972 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:17.452904 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls podName:6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b nodeName:}" failed. No retries permitted until 2026-04-16 23:28:19.452879483 +0000 UTC m=+107.054798646 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xdd9t" (UID: "6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:19.468150 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:19.468112 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:19.468628 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:19.468279 2573 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 23:28:19.468628 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:19.468364 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls podName:6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b nodeName:}" failed. No retries permitted until 2026-04-16 23:28:23.468342966 +0000 UTC m=+111.070262046 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xdd9t" (UID: "6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b") : secret "insights-runtime-extractor-tls" not found Apr 16 23:28:23.500202 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:23.500157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:23.502404 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:23.502382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xdd9t\" (UID: \"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b\") " pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:23.569932 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:23.569906 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xdd9t" Apr 16 23:28:23.683458 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:23.683430 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xdd9t"] Apr 16 23:28:23.686499 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:23.686473 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7dbba2_9f6d_49ca_be01_6f1dcc09a72b.slice/crio-741d3114f1d493bb1895935c2fd90d3c84b403f566b817fb02297bd1f43b735a WatchSource:0}: Error finding container 741d3114f1d493bb1895935c2fd90d3c84b403f566b817fb02297bd1f43b735a: Status 404 returned error can't find the container with id 741d3114f1d493bb1895935c2fd90d3c84b403f566b817fb02297bd1f43b735a Apr 16 23:28:23.702159 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:23.702130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:23.704806 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:23.704786 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3e49ef-6cc9-41e0-8311-05eb71f392d5-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-s8sqm\" (UID: \"ad3e49ef-6cc9-41e0-8311-05eb71f392d5\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:23.774572 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:23.774511 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" Apr 16 23:28:23.879626 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:23.879601 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm"] Apr 16 23:28:24.363360 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:24.363335 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" event={"ID":"ad3e49ef-6cc9-41e0-8311-05eb71f392d5","Type":"ContainerStarted","Data":"3d48519909d4179a2d14f874ba2c4ddff96f32454c1d3de0f1f02429d695589c"} Apr 16 23:28:24.364527 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:24.364505 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xdd9t" event={"ID":"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b","Type":"ContainerStarted","Data":"9158be050db6b4feb6f2449dedfcba0e75166a13b438070e9a882e2e2e205aeb"} Apr 16 23:28:24.364636 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:24.364553 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xdd9t" event={"ID":"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b","Type":"ContainerStarted","Data":"741d3114f1d493bb1895935c2fd90d3c84b403f566b817fb02297bd1f43b735a"} Apr 16 23:28:25.371147 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:25.371108 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xdd9t" event={"ID":"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b","Type":"ContainerStarted","Data":"44c4f4e3e44952b69b626ba264e242736392c0dd33db39fea4f994363fb0252a"} Apr 16 23:28:26.374374 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:26.374343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" 
event={"ID":"ad3e49ef-6cc9-41e0-8311-05eb71f392d5","Type":"ContainerStarted","Data":"5b7276954196fde3479d04efed82bb42b6f925af6efbfc169d96c8f97cbb03ef"} Apr 16 23:28:26.379880 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:26.379842 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xdd9t" event={"ID":"6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b","Type":"ContainerStarted","Data":"9d288fb4575808f225926d0bb36b146d580f29af9c37358054e4831e510690ab"} Apr 16 23:28:26.397082 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:26.396939 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xdd9t" podStartSLOduration=8.864863521 podStartE2EDuration="11.396921495s" podCreationTimestamp="2026-04-16 23:28:15 +0000 UTC" firstStartedPulling="2026-04-16 23:28:23.742016085 +0000 UTC m=+111.343935177" lastFinishedPulling="2026-04-16 23:28:26.274074072 +0000 UTC m=+113.875993151" observedRunningTime="2026-04-16 23:28:26.396092715 +0000 UTC m=+113.998011818" watchObservedRunningTime="2026-04-16 23:28:26.396921495 +0000 UTC m=+113.998840598" Apr 16 23:28:27.383204 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:27.383168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" event={"ID":"ad3e49ef-6cc9-41e0-8311-05eb71f392d5","Type":"ContainerStarted","Data":"1a43a68734f2c19b8f773d212885972f2c2ed4fffd90978b0ae4d6d74b339be4"} Apr 16 23:28:27.398361 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:27.398319 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-s8sqm" podStartSLOduration=18.038268359 podStartE2EDuration="20.398306136s" podCreationTimestamp="2026-04-16 23:28:07 +0000 UTC" firstStartedPulling="2026-04-16 23:28:23.912157435 +0000 UTC m=+111.514076514" lastFinishedPulling="2026-04-16 
23:28:26.2721952 +0000 UTC m=+113.874114291" observedRunningTime="2026-04-16 23:28:27.397685338 +0000 UTC m=+114.999604436" watchObservedRunningTime="2026-04-16 23:28:27.398306136 +0000 UTC m=+115.000225237" Apr 16 23:28:30.755104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.755063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:28:30.757652 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.757626 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/84d158a6-dfbe-407b-be72-b82ba7380fd8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-w9vw5\" (UID: \"84d158a6-dfbe-407b-be72-b82ba7380fd8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:28:30.799806 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.799776 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" Apr 16 23:28:30.855515 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.855481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:30.855645 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.855629 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:30.856136 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.856110 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-service-ca-bundle\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:30.858240 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.858216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/99c15427-bc7a-4ab4-b1c3-c425e97bf63c-metrics-certs\") pod \"router-default-5897bbd496-f9ftq\" (UID: \"99c15427-bc7a-4ab4-b1c3-c425e97bf63c\") " pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:30.899005 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.898980 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:30.912656 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:30.912635 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5"] Apr 16 23:28:30.915390 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:30.915364 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d158a6_dfbe_407b_be72_b82ba7380fd8.slice/crio-a70110e29862e6a7d21a3edfb5a84fa2fea7463458aa4895e02d71c254828c2e WatchSource:0}: Error finding container a70110e29862e6a7d21a3edfb5a84fa2fea7463458aa4895e02d71c254828c2e: Status 404 returned error can't find the container with id a70110e29862e6a7d21a3edfb5a84fa2fea7463458aa4895e02d71c254828c2e Apr 16 23:28:31.023735 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:31.023677 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5897bbd496-f9ftq"] Apr 16 23:28:31.026263 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:31.026239 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99c15427_bc7a_4ab4_b1c3_c425e97bf63c.slice/crio-1093d2712e9f70e429f0f54049e0bcfcaa4d922df36b376c5d59e0621e09d2a5 WatchSource:0}: Error finding container 1093d2712e9f70e429f0f54049e0bcfcaa4d922df36b376c5d59e0621e09d2a5: Status 404 returned error can't find the container with id 1093d2712e9f70e429f0f54049e0bcfcaa4d922df36b376c5d59e0621e09d2a5 Apr 16 23:28:31.395254 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:31.395165 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5897bbd496-f9ftq" event={"ID":"99c15427-bc7a-4ab4-b1c3-c425e97bf63c","Type":"ContainerStarted","Data":"b12257273c82ce33c3bbf2a00fb7c53f68f604e0666a81567aa3e0e8662ba2ec"} Apr 16 23:28:31.395254 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:28:31.395204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5897bbd496-f9ftq" event={"ID":"99c15427-bc7a-4ab4-b1c3-c425e97bf63c","Type":"ContainerStarted","Data":"1093d2712e9f70e429f0f54049e0bcfcaa4d922df36b376c5d59e0621e09d2a5"} Apr 16 23:28:31.396151 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:31.396124 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" event={"ID":"84d158a6-dfbe-407b-be72-b82ba7380fd8","Type":"ContainerStarted","Data":"a70110e29862e6a7d21a3edfb5a84fa2fea7463458aa4895e02d71c254828c2e"} Apr 16 23:28:31.411970 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:31.411919 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5897bbd496-f9ftq" podStartSLOduration=32.411905152 podStartE2EDuration="32.411905152s" podCreationTimestamp="2026-04-16 23:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:28:31.411111865 +0000 UTC m=+119.013030966" watchObservedRunningTime="2026-04-16 23:28:31.411905152 +0000 UTC m=+119.013824253" Apr 16 23:28:31.900086 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:31.900053 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:31.902315 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:31.902295 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:32.398507 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:32.398471 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:32.399814 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:32.399789 2573 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5897bbd496-f9ftq" Apr 16 23:28:33.401396 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.401356 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" event={"ID":"84d158a6-dfbe-407b-be72-b82ba7380fd8","Type":"ContainerStarted","Data":"359229287d5d59e690681efe9ca6820c6b798cf44daa20a5233c15d903a565dc"} Apr 16 23:28:33.416597 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.416558 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-w9vw5" podStartSLOduration=33.488072992 podStartE2EDuration="35.416527835s" podCreationTimestamp="2026-04-16 23:27:58 +0000 UTC" firstStartedPulling="2026-04-16 23:28:30.917259723 +0000 UTC m=+118.519178802" lastFinishedPulling="2026-04-16 23:28:32.845714565 +0000 UTC m=+120.447633645" observedRunningTime="2026-04-16 23:28:33.41535534 +0000 UTC m=+121.017274441" watchObservedRunningTime="2026-04-16 23:28:33.416527835 +0000 UTC m=+121.018446935" Apr 16 23:28:33.909095 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.909061 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8"] Apr 16 23:28:33.912207 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.912184 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" Apr 16 23:28:33.914974 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.914948 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-mhvcv\"" Apr 16 23:28:33.914974 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.914968 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 23:28:33.920712 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.920683 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8"] Apr 16 23:28:33.933138 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.933115 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7f59d69d8f-zflqg"] Apr 16 23:28:33.936366 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.936344 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:33.938572 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.938552 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 23:28:33.938689 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.938670 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 23:28:33.938757 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.938573 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 23:28:33.938841 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.938819 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xwc2q\"" Apr 16 23:28:33.944276 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.944257 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 23:28:33.948622 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.948603 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f59d69d8f-zflqg"] Apr 16 23:28:33.982768 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:33.982740 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dcf7a7a8-e110-4fa0-a351-479b0d42756c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-s8vl8\" (UID: \"dcf7a7a8-e110-4fa0-a351-479b0d42756c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" Apr 16 23:28:34.083873 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.083846 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-bound-sa-token\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.083968 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.083881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-image-registry-private-configuration\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.083968 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.083924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-registry-certificates\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.083968 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.083964 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-ca-trust-extracted\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.084082 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.083981 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-installation-pull-secrets\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.084082 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.084008 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dcf7a7a8-e110-4fa0-a351-479b0d42756c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-s8vl8\" (UID: \"dcf7a7a8-e110-4fa0-a351-479b0d42756c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" Apr 16 23:28:34.084082 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.084030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-registry-tls\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.084082 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.084053 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm2k5\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-kube-api-access-dm2k5\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.084082 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.084076 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-trusted-ca\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " 
pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.086224 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.086206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/dcf7a7a8-e110-4fa0-a351-479b0d42756c-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-s8vl8\" (UID: \"dcf7a7a8-e110-4fa0-a351-479b0d42756c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" Apr 16 23:28:34.184448 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-registry-certificates\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.184623 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184467 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-ca-trust-extracted\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.184623 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-installation-pull-secrets\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.184623 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184522 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-registry-tls\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.184623 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184564 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dm2k5\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-kube-api-access-dm2k5\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.184623 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-trusted-ca\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.184868 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184676 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-bound-sa-token\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.184868 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-image-registry-private-configuration\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " 
pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.184964 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.184881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-ca-trust-extracted\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.185414 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.185391 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-registry-certificates\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.185683 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.185660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-trusted-ca\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.186833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.186808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-installation-pull-secrets\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.187038 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.187020 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-image-registry-private-configuration\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.187435 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.187420 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-registry-tls\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.194587 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.194515 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-bound-sa-token\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.194587 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.194518 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm2k5\" (UniqueName: \"kubernetes.io/projected/b7bb1187-a715-4b7a-aa6e-0dc183dc753d-kube-api-access-dm2k5\") pod \"image-registry-7f59d69d8f-zflqg\" (UID: \"b7bb1187-a715-4b7a-aa6e-0dc183dc753d\") " pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.222826 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.222807 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" Apr 16 23:28:34.246504 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.246485 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:34.341973 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.341936 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8"] Apr 16 23:28:34.346699 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:34.346673 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf7a7a8_e110_4fa0_a351_479b0d42756c.slice/crio-ee71c1ca5688952bb99aa7f9cedaa5aeb4bac206afab4f8c2cbd92fa168fa643 WatchSource:0}: Error finding container ee71c1ca5688952bb99aa7f9cedaa5aeb4bac206afab4f8c2cbd92fa168fa643: Status 404 returned error can't find the container with id ee71c1ca5688952bb99aa7f9cedaa5aeb4bac206afab4f8c2cbd92fa168fa643 Apr 16 23:28:34.365313 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.365287 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7f59d69d8f-zflqg"] Apr 16 23:28:34.368500 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:34.368472 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7bb1187_a715_4b7a_aa6e_0dc183dc753d.slice/crio-2b98d7ec67af5f07d6f0d4eb4036ab20b77de4a2b496222f30abf115e19abe8b WatchSource:0}: Error finding container 2b98d7ec67af5f07d6f0d4eb4036ab20b77de4a2b496222f30abf115e19abe8b: Status 404 returned error can't find the container with id 2b98d7ec67af5f07d6f0d4eb4036ab20b77de4a2b496222f30abf115e19abe8b Apr 16 23:28:34.404189 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.404166 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" event={"ID":"b7bb1187-a715-4b7a-aa6e-0dc183dc753d","Type":"ContainerStarted","Data":"2b98d7ec67af5f07d6f0d4eb4036ab20b77de4a2b496222f30abf115e19abe8b"} Apr 16 23:28:34.405094 
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:34.405075 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" event={"ID":"dcf7a7a8-e110-4fa0-a351-479b0d42756c","Type":"ContainerStarted","Data":"ee71c1ca5688952bb99aa7f9cedaa5aeb4bac206afab4f8c2cbd92fa168fa643"} Apr 16 23:28:35.409876 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:35.409839 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" event={"ID":"b7bb1187-a715-4b7a-aa6e-0dc183dc753d","Type":"ContainerStarted","Data":"cdaac6bba7e91ba450f5b8b1411664c5897a06f425092a0897352e0884edf060"} Apr 16 23:28:35.410262 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:35.409965 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:28:36.414272 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:36.414238 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" event={"ID":"dcf7a7a8-e110-4fa0-a351-479b0d42756c","Type":"ContainerStarted","Data":"2c09f2d064b686d4b866875ffecf496757c107da8d6c448123b8914d590d1a06"} Apr 16 23:28:36.414853 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:36.414490 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" Apr 16 23:28:36.419385 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:36.419351 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" Apr 16 23:28:36.428236 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:36.428190 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-s8vl8" podStartSLOduration=2.266034535 podStartE2EDuration="3.428178556s" podCreationTimestamp="2026-04-16 23:28:33 +0000 UTC" firstStartedPulling="2026-04-16 23:28:34.348469742 +0000 UTC m=+121.950388820" lastFinishedPulling="2026-04-16 23:28:35.510613762 +0000 UTC m=+123.112532841" observedRunningTime="2026-04-16 23:28:36.427604961 +0000 UTC m=+124.029524067" watchObservedRunningTime="2026-04-16 23:28:36.428178556 +0000 UTC m=+124.030097657" Apr 16 23:28:36.428487 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:36.428466 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" podStartSLOduration=3.428461539 podStartE2EDuration="3.428461539s" podCreationTimestamp="2026-04-16 23:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:28:35.427908081 +0000 UTC m=+123.029827179" watchObservedRunningTime="2026-04-16 23:28:36.428461539 +0000 UTC m=+124.030380639" Apr 16 23:28:41.691630 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.691575 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7h258"] Apr 16 23:28:41.696012 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.695986 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.698463 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.698434 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 23:28:41.698619 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.698597 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sfntm\"" Apr 16 23:28:41.698771 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.698447 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 23:28:41.698872 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.698803 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 23:28:41.699130 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.699112 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 23:28:41.742248 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.742223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:28:41.744689 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.744666 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ed66159-86cf-4f43-824b-3905a5019c1c-metrics-certs\") pod \"network-metrics-daemon-qhz5v\" (UID: \"3ed66159-86cf-4f43-824b-3905a5019c1c\") " pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 
23:28:41.842756 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.842732 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-root\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.842854 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.842765 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-wtmp\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.842854 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.842798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.842943 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.842847 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8pjn\" (UniqueName: \"kubernetes.io/projected/4a216be1-f2fc-4496-b025-c083bf935ba0-kube-api-access-l8pjn\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.842988 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.842968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-sys\") pod 
\"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.843035 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.843017 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.843078 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.843052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-textfile\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.843121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.843077 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a216be1-f2fc-4496-b025-c083bf935ba0-metrics-client-ca\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.843176 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.843154 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-tls\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.867932 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.867911 2573 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-wl6xg\"" Apr 16 23:28:41.876579 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.876558 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qhz5v" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-root\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944665 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-wtmp\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944708 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944735 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8pjn\" (UniqueName: \"kubernetes.io/projected/4a216be1-f2fc-4496-b025-c083bf935ba0-kube-api-access-l8pjn\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-sys\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944839 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-textfile\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a216be1-f2fc-4496-b025-c083bf935ba0-metrics-client-ca\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.944904 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-tls\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " 
pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.945693 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-root\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.946296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.945824 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-wtmp\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.948271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.947554 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-accelerators-collector-config\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.948271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.947627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a216be1-f2fc-4496-b025-c083bf935ba0-sys\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.948271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.947838 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-textfile\") pod \"node-exporter-7h258\" (UID: 
\"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.948271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.948212 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a216be1-f2fc-4496-b025-c083bf935ba0-metrics-client-ca\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.951793 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.951740 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-tls\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.953277 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.953229 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a216be1-f2fc-4496-b025-c083bf935ba0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:41.954042 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:41.954021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8pjn\" (UniqueName: \"kubernetes.io/projected/4a216be1-f2fc-4496-b025-c083bf935ba0-kube-api-access-l8pjn\") pod \"node-exporter-7h258\" (UID: \"4a216be1-f2fc-4496-b025-c083bf935ba0\") " pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:42.006329 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.006301 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7h258" Apr 16 23:28:42.010820 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.010793 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qhz5v"] Apr 16 23:28:42.016688 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:42.016257 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a216be1_f2fc_4496_b025_c083bf935ba0.slice/crio-045e32f48d618ef0df73647bd310db458093354c54862c3231dceeffe4d87a88 WatchSource:0}: Error finding container 045e32f48d618ef0df73647bd310db458093354c54862c3231dceeffe4d87a88: Status 404 returned error can't find the container with id 045e32f48d618ef0df73647bd310db458093354c54862c3231dceeffe4d87a88 Apr 16 23:28:42.018752 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:42.018646 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ed66159_86cf_4f43_824b_3905a5019c1c.slice/crio-e42bfac19275014e0d3bf922479249d1b8f24208606b23ec9fb4326af94f6359 WatchSource:0}: Error finding container e42bfac19275014e0d3bf922479249d1b8f24208606b23ec9fb4326af94f6359: Status 404 returned error can't find the container with id e42bfac19275014e0d3bf922479249d1b8f24208606b23ec9fb4326af94f6359 Apr 16 23:28:42.430436 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.430403 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7h258" event={"ID":"4a216be1-f2fc-4496-b025-c083bf935ba0","Type":"ContainerStarted","Data":"045e32f48d618ef0df73647bd310db458093354c54862c3231dceeffe4d87a88"} Apr 16 23:28:42.431491 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.431461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhz5v" 
event={"ID":"3ed66159-86cf-4f43-824b-3905a5019c1c","Type":"ContainerStarted","Data":"e42bfac19275014e0d3bf922479249d1b8f24208606b23ec9fb4326af94f6359"} Apr 16 23:28:42.768400 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.768157 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:28:42.772519 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.772498 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.775239 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.775216 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 23:28:42.775724 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.775701 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 23:28:42.775940 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.775921 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 23:28:42.776131 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.776107 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 23:28:42.776316 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.776301 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 23:28:42.776495 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.776481 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 23:28:42.776883 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.776863 2573 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 23:28:42.777085 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.777047 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 23:28:42.777085 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.777065 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5q2t6\"" Apr 16 23:28:42.777304 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.777288 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 23:28:42.789660 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.789633 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:28:42.953138 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953138 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953139 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953365 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953179 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-web-config\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953365 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953216 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953365 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953365 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953279 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqsl\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-kube-api-access-zkqsl\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953365 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953306 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953365 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953335 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953815 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953815 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953394 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953815 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953427 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953815 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953455 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:42.953815 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:42.953488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-out\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.054029 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.053952 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.054882 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.054006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-out\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056167 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056244 
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056403 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056266 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-web-config\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056470 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056405 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056470 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056438 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056609 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056552 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqsl\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-kube-api-access-zkqsl\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056666 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056746 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056649 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056804 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056857 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.056907 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.056892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.059667 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.058472 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.059667 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.059289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.062616 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.062592 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-out\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.063411 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.063220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.067130 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.066665 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-web-config\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.067332 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.067214 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.068653 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.067888 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.068653 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.068103 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.068653 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.068389 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.068653 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.068395 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.072968 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.072869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.073617 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.073574 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-volume\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.073995 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.073947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqsl\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-kube-api-access-zkqsl\") pod \"alertmanager-main-0\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.086640 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.086618 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:28:43.230056 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:28:43.230019 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a216be1_f2fc_4496_b025_c083bf935ba0.slice/crio-conmon-43f4eb547e5884167b4a711aecec03c99e1092a7fa6aaf9c13645a5b59f17620.scope\": RecentStats: unable to find data in memory cache]" Apr 16 23:28:43.237501 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.237475 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:28:43.406411 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:43.406350 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8630dcce_64e8_4324_bc68_af0b8d38d9b3.slice/crio-215524d96274868237c82aaee2cd0d47194ea50a689b5809198a1d2cbf09f07b WatchSource:0}: Error finding container 215524d96274868237c82aaee2cd0d47194ea50a689b5809198a1d2cbf09f07b: Status 404 returned error can't find the container with id 215524d96274868237c82aaee2cd0d47194ea50a689b5809198a1d2cbf09f07b Apr 16 23:28:43.435876 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.435847 2573 generic.go:358] "Generic (PLEG): container finished" podID="4a216be1-f2fc-4496-b025-c083bf935ba0" containerID="43f4eb547e5884167b4a711aecec03c99e1092a7fa6aaf9c13645a5b59f17620" exitCode=0 Apr 16 23:28:43.435998 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.435932 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7h258" event={"ID":"4a216be1-f2fc-4496-b025-c083bf935ba0","Type":"ContainerDied","Data":"43f4eb547e5884167b4a711aecec03c99e1092a7fa6aaf9c13645a5b59f17620"} Apr 16 23:28:43.437078 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:43.437056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerStarted","Data":"215524d96274868237c82aaee2cd0d47194ea50a689b5809198a1d2cbf09f07b"} Apr 16 23:28:44.441350 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.441324 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7h258" event={"ID":"4a216be1-f2fc-4496-b025-c083bf935ba0","Type":"ContainerStarted","Data":"1b184664ab3dae26407752b6269acbe17901fa033d31fb88fa12de5af15b0405"} Apr 16 23:28:44.441682 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.441361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7h258" event={"ID":"4a216be1-f2fc-4496-b025-c083bf935ba0","Type":"ContainerStarted","Data":"8c4de64a85b523197a782eaca0835c9a0ce8f7b6785066ca5423bb09d211c7a3"} Apr 16 23:28:44.442893 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.442868 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhz5v" event={"ID":"3ed66159-86cf-4f43-824b-3905a5019c1c","Type":"ContainerStarted","Data":"84318cff4c578d7e175239eea75c69e99bce9e6e2731b8a809f21a59ec6e73fa"} Apr 16 23:28:44.443003 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.442901 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qhz5v" event={"ID":"3ed66159-86cf-4f43-824b-3905a5019c1c","Type":"ContainerStarted","Data":"e5ec5ca3d9a6b35974137505680cc105a48633f2ece1f7174bcd1c15f54ba925"} Apr 16 23:28:44.444227 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.444205 2573 generic.go:358] "Generic (PLEG): container finished" podID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerID="ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515" exitCode=0 Apr 16 23:28:44.444311 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.444277 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerDied","Data":"ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515"} Apr 16 23:28:44.458944 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.458899 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7h258" podStartSLOduration=2.436046061 podStartE2EDuration="3.458886604s" podCreationTimestamp="2026-04-16 23:28:41 +0000 UTC" firstStartedPulling="2026-04-16 23:28:42.018976616 +0000 UTC m=+129.620895695" lastFinishedPulling="2026-04-16 23:28:43.04181715 +0000 UTC m=+130.643736238" observedRunningTime="2026-04-16 23:28:44.457807296 +0000 UTC m=+132.059726397" watchObservedRunningTime="2026-04-16 23:28:44.458886604 +0000 UTC m=+132.060805704" Apr 16 23:28:44.496649 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.496572 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qhz5v" podStartSLOduration=130.064539476 podStartE2EDuration="2m11.496558818s" podCreationTimestamp="2026-04-16 23:26:33 +0000 UTC" firstStartedPulling="2026-04-16 23:28:42.020293655 +0000 UTC m=+129.622212738" lastFinishedPulling="2026-04-16 23:28:43.452312996 +0000 UTC m=+131.054232080" observedRunningTime="2026-04-16 23:28:44.472997719 +0000 UTC m=+132.074916821" watchObservedRunningTime="2026-04-16 23:28:44.496558818 +0000 UTC m=+132.098477920" Apr 16 23:28:44.673102 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.673040 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"] Apr 16 23:28:44.677055 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.677036 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.679116 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.679097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-e7777bkr75rv4\"" Apr 16 23:28:44.679231 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.679204 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 23:28:44.679470 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.679454 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 23:28:44.679571 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.679471 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 23:28:44.679644 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.679595 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-lxqdp\"" Apr 16 23:28:44.679848 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.679832 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 23:28:44.679927 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.679902 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 23:28:44.689031 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.689014 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"] Apr 16 23:28:44.772913 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.772878 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-tls\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.773011 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.772915 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.773011 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.772944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1511bfed-c802-44ba-95c8-e41c1ab89d60-metrics-client-ca\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.773011 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.772986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.773111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.773033 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.773111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.773049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-grpc-tls\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.773111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.773066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44bhq\" (UniqueName: \"kubernetes.io/projected/1511bfed-c802-44ba-95c8-e41c1ab89d60-kube-api-access-44bhq\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.773111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.773089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:44.874206 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1511bfed-c802-44ba-95c8-e41c1ab89d60-metrics-client-ca\") pod 
\"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.874321 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.874321 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.874321 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-grpc-tls\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.874321 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44bhq\" (UniqueName: \"kubernetes.io/projected/1511bfed-c802-44ba-95c8-e41c1ab89d60-kube-api-access-44bhq\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.874321 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874315 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.874581 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874443 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-tls\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.874581 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874477 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.874980 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.874926 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1511bfed-c802-44ba-95c8-e41c1ab89d60-metrics-client-ca\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.877207 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.877162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.877445 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.877426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-grpc-tls\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.877685 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.877660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.877929 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.877910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-tls\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.877987 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.877945 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.878031 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.877984 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1511bfed-c802-44ba-95c8-e41c1ab89d60-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.881630 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.881606 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44bhq\" (UniqueName: \"kubernetes.io/projected/1511bfed-c802-44ba-95c8-e41c1ab89d60-kube-api-access-44bhq\") pod \"thanos-querier-5fc7b44fd6-kfxd7\" (UID: \"1511bfed-c802-44ba-95c8-e41c1ab89d60\") " pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:44.989955 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:44.989935 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"
Apr 16 23:28:45.105766 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:45.105735 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7"]
Apr 16 23:28:45.108506 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:45.108479 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1511bfed_c802_44ba_95c8_e41c1ab89d60.slice/crio-2a58cc932f16d552d35f46bb73456ba1f689b7edb627d19ef98eb9dd70bfd553 WatchSource:0}: Error finding container 2a58cc932f16d552d35f46bb73456ba1f689b7edb627d19ef98eb9dd70bfd553: Status 404 returned error can't find the container with id 2a58cc932f16d552d35f46bb73456ba1f689b7edb627d19ef98eb9dd70bfd553
Apr 16 23:28:45.447448 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:45.447402 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" event={"ID":"1511bfed-c802-44ba-95c8-e41c1ab89d60","Type":"ContainerStarted","Data":"2a58cc932f16d552d35f46bb73456ba1f689b7edb627d19ef98eb9dd70bfd553"}
Apr 16 23:28:46.453292 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.453239 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerStarted","Data":"e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f"}
Apr 16 23:28:46.453985 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.453961 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl"]
Apr 16 23:28:46.457358 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.457340 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl"
Apr 16 23:28:46.459254 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.459230 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 23:28:46.459360 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.459300 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-sh5jd\""
Apr 16 23:28:46.463898 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.463863 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl"]
Apr 16 23:28:46.589253 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.589172 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-krbwl\" (UID: \"5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl"
Apr 16 23:28:46.689901 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.689869 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-krbwl\" (UID: \"5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl"
Apr 16 23:28:46.692776 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.692754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-krbwl\" (UID: \"5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl"
Apr 16 23:28:46.769781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.769749 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl"
Apr 16 23:28:46.994719 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:46.994678 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl"]
Apr 16 23:28:46.997123 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:46.997089 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9f37f0_a72b_46b9_8ebb_6751c4d3d1da.slice/crio-866eb6226202326dd672ad343dc27c2c964121606551a85b4d8cb815814e6bce WatchSource:0}: Error finding container 866eb6226202326dd672ad343dc27c2c964121606551a85b4d8cb815814e6bce: Status 404 returned error can't find the container with id 866eb6226202326dd672ad343dc27c2c964121606551a85b4d8cb815814e6bce
Apr 16 23:28:47.458454 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.458424 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerStarted","Data":"1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f"}
Apr 16 23:28:47.458852 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.458461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerStarted","Data":"a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf"}
Apr 16 23:28:47.458852 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.458471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerStarted","Data":"192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122"}
Apr 16 23:28:47.458852 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.458479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerStarted","Data":"a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da"}
Apr 16 23:28:47.460173 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.460152 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" event={"ID":"1511bfed-c802-44ba-95c8-e41c1ab89d60","Type":"ContainerStarted","Data":"b649e2f8eac25da2eb43819ce187187c0a388f6afb1e5cc6369bc3bef221fecc"}
Apr 16 23:28:47.460248 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.460184 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" event={"ID":"1511bfed-c802-44ba-95c8-e41c1ab89d60","Type":"ContainerStarted","Data":"27cf661b5642ef53a12486469ea4dc7dbae218e438fd3a7100ac76854f4f0e04"}
Apr 16 23:28:47.460248 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.460197 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" event={"ID":"1511bfed-c802-44ba-95c8-e41c1ab89d60","Type":"ContainerStarted","Data":"9298025a978edfff94b9318627ed2e6b73706ca35fe913050ea183daba66e3ac"}
Apr 16 23:28:47.461111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.461088 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl" event={"ID":"5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da","Type":"ContainerStarted","Data":"866eb6226202326dd672ad343dc27c2c964121606551a85b4d8cb815814e6bce"}
Apr 16 23:28:47.885352 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.885281 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 23:28:47.888678 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.888655 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:47.891151 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891120 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 23:28:47.891151 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 23:28:47.891344 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891147 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 23:28:47.891441 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891424 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 23:28:47.891441 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891438 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 23:28:47.891573 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891451 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 23:28:47.891573 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891489 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 23:28:47.891850 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891828 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 23:28:47.891918 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.891862 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 23:28:47.892117 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.892097 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 23:28:47.892302 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.892166 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-w2lnx\""
Apr 16 23:28:47.892302 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.892185 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7sd0ko30jut1s\""
Apr 16 23:28:47.892434 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.892423 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 23:28:47.892563 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.892457 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 23:28:47.894666 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.894644 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 23:28:47.905044 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:47.905019 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 23:28:48.000042 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000014 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000042 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000114 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000151 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-config-out\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000244 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000221 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000429 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000252 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000429 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000429 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000353 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000429 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000395 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-527p9\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-kube-api-access-527p9\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000609 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000430 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-web-config\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000609 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000456 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000609 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000481 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000609 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000609 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-config\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000806 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000634 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000806 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000676 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.000806 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.000697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.101604 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101565 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.101763 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101608 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.101763 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.101763 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101680 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-527p9\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-kube-api-access-527p9\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.101763 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101713 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-web-config\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.101763 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102006 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102006 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101802 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102006 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101833 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-config\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102006 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101859 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102006 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102006 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102006 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.101988 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102324 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.102045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102324 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.102086 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102324 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.102117 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-config-out\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102324 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.102147 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102324 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.102176 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.102627 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.102605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.103232 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.103205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.105104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.104675 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-web-config\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.105104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.104806 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-config\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.105104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.104869 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.105104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.105073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.105104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.105090 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.106132 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.105489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.106132 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.105524 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.106132 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.105948 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.106132 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.106097 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:28:48.106390 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.106224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:28:48.107421 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.107355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:28:48.108001 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.107901 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-config-out\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:28:48.108715 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.108682 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:28:48.109054 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.108991 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:28:48.109563 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.109502 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:28:48.110319 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.110299 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-527p9\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-kube-api-access-527p9\") pod \"prometheus-k8s-0\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:28:48.200782 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.200751 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:28:48.370598 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.370573 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:28:48.374204 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:28:48.374172 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28ee3c24_b647_46de_9386_3a6ce76ed47c.slice/crio-29df712728343e94398009c32255d836c59d719294b0609796a1d10778163a7c WatchSource:0}: Error finding container 29df712728343e94398009c32255d836c59d719294b0609796a1d10778163a7c: Status 404 returned error can't find the container with id 29df712728343e94398009c32255d836c59d719294b0609796a1d10778163a7c Apr 16 23:28:48.467436 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.467327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerStarted","Data":"a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6"} Apr 16 23:28:48.470405 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.470376 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" 
event={"ID":"1511bfed-c802-44ba-95c8-e41c1ab89d60","Type":"ContainerStarted","Data":"daa5bdfc993c5785403924a9f57b9ae4952964a3c28f173e03e460e847588c82"} Apr 16 23:28:48.470481 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.470417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" event={"ID":"1511bfed-c802-44ba-95c8-e41c1ab89d60","Type":"ContainerStarted","Data":"f206970f508832fad5ac7d52f61d8d116d7147d10860b1fa802af181697fcda1"} Apr 16 23:28:48.471925 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.471902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl" event={"ID":"5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da","Type":"ContainerStarted","Data":"4bc653144c5431403b4d89ae84acb8f7f5e3fd37720658b2933672344874c5df"} Apr 16 23:28:48.472384 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.472367 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl" Apr 16 23:28:48.473753 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.473726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerStarted","Data":"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782"} Apr 16 23:28:48.473826 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.473761 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerStarted","Data":"29df712728343e94398009c32255d836c59d719294b0609796a1d10778163a7c"} Apr 16 23:28:48.477903 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.477870 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl" Apr 16 23:28:48.491024 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:28:48.490987 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.6581095609999998 podStartE2EDuration="6.490974603s" podCreationTimestamp="2026-04-16 23:28:42 +0000 UTC" firstStartedPulling="2026-04-16 23:28:43.408721835 +0000 UTC m=+131.010640929" lastFinishedPulling="2026-04-16 23:28:48.241586889 +0000 UTC m=+135.843505971" observedRunningTime="2026-04-16 23:28:48.489094117 +0000 UTC m=+136.091013217" watchObservedRunningTime="2026-04-16 23:28:48.490974603 +0000 UTC m=+136.092893701" Apr 16 23:28:48.503080 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:48.503043 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-krbwl" podStartSLOduration=1.209376622 podStartE2EDuration="2.503032571s" podCreationTimestamp="2026-04-16 23:28:46 +0000 UTC" firstStartedPulling="2026-04-16 23:28:46.998758756 +0000 UTC m=+134.600677837" lastFinishedPulling="2026-04-16 23:28:48.292414706 +0000 UTC m=+135.894333786" observedRunningTime="2026-04-16 23:28:48.501628922 +0000 UTC m=+136.103548025" watchObservedRunningTime="2026-04-16 23:28:48.503032571 +0000 UTC m=+136.104951718" Apr 16 23:28:49.479370 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:49.479326 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" event={"ID":"1511bfed-c802-44ba-95c8-e41c1ab89d60","Type":"ContainerStarted","Data":"d0799153048311cd6985b90f498207bca83655b0156839d2559efca617986ea2"} Apr 16 23:28:49.479815 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:49.479506 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:49.480790 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:49.480761 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" exitCode=0 Apr 16 23:28:49.480931 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:49.480847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerDied","Data":"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782"} Apr 16 23:28:49.500399 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:49.500356 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" podStartSLOduration=2.369239196 podStartE2EDuration="5.500343157s" podCreationTimestamp="2026-04-16 23:28:44 +0000 UTC" firstStartedPulling="2026-04-16 23:28:45.110484091 +0000 UTC m=+132.712403184" lastFinishedPulling="2026-04-16 23:28:48.241588051 +0000 UTC m=+135.843507145" observedRunningTime="2026-04-16 23:28:49.499150721 +0000 UTC m=+137.101069839" watchObservedRunningTime="2026-04-16 23:28:49.500343157 +0000 UTC m=+137.102262257" Apr 16 23:28:52.493618 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:52.493583 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerStarted","Data":"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1"} Apr 16 23:28:52.493618 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:52.493619 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerStarted","Data":"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8"} Apr 16 23:28:52.494166 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:52.493630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerStarted","Data":"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf"} Apr 16 23:28:52.494166 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:52.493639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerStarted","Data":"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d"} Apr 16 23:28:52.494166 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:52.493647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerStarted","Data":"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d"} Apr 16 23:28:52.494166 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:52.493655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerStarted","Data":"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88"} Apr 16 23:28:52.518353 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:52.518304 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.105130283 podStartE2EDuration="5.51829235s" podCreationTimestamp="2026-04-16 23:28:47 +0000 UTC" firstStartedPulling="2026-04-16 23:28:49.482162052 +0000 UTC m=+137.084081131" lastFinishedPulling="2026-04-16 23:28:51.895324116 +0000 UTC m=+139.497243198" observedRunningTime="2026-04-16 23:28:52.516643286 +0000 UTC m=+140.118562387" watchObservedRunningTime="2026-04-16 23:28:52.51829235 +0000 UTC m=+140.120211451" Apr 16 23:28:53.201680 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:53.201647 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
23:28:54.250623 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:54.250587 2573 patch_prober.go:28] interesting pod/image-registry-7f59d69d8f-zflqg container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 23:28:54.250995 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:54.250640 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" podUID="b7bb1187-a715-4b7a-aa6e-0dc183dc753d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 23:28:55.491046 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:55.491015 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5fc7b44fd6-kfxd7" Apr 16 23:28:56.419634 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:28:56.419609 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7f59d69d8f-zflqg" Apr 16 23:29:08.288918 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:29:08.288871 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2cxtr" podUID="fbf7e722-63c3-4ad8-b126-d39966fa38f3" Apr 16 23:29:08.295384 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:29:08.295357 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fdk7d" podUID="9ee4f424-f877-4116-acf8-e4a8c70b5329" Apr 16 23:29:08.539445 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:08.539375 2573 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fdk7d" Apr 16 23:29:08.539575 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:08.539382 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2cxtr" Apr 16 23:29:13.093137 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.093065 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d" Apr 16 23:29:13.093137 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.093108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:29:13.095548 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.095509 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fbf7e722-63c3-4ad8-b126-d39966fa38f3-metrics-tls\") pod \"dns-default-2cxtr\" (UID: \"fbf7e722-63c3-4ad8-b126-d39966fa38f3\") " pod="openshift-dns/dns-default-2cxtr" Apr 16 23:29:13.095647 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.095590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9ee4f424-f877-4116-acf8-e4a8c70b5329-cert\") pod \"ingress-canary-fdk7d\" (UID: \"9ee4f424-f877-4116-acf8-e4a8c70b5329\") " pod="openshift-ingress-canary/ingress-canary-fdk7d" Apr 16 23:29:13.342242 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.342217 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-dns\"/\"dns-dockercfg-89jck\"" Apr 16 23:29:13.343035 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.343017 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m6886\"" Apr 16 23:29:13.350968 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.350916 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2cxtr" Apr 16 23:29:13.351056 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.351008 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fdk7d" Apr 16 23:29:13.479087 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.479062 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fdk7d"] Apr 16 23:29:13.481865 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:29:13.481833 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee4f424_f877_4116_acf8_e4a8c70b5329.slice/crio-bac751d9090e7c4f05903e9794efa5c74da73e28e4eaab6116340b8c9d8dbe7b WatchSource:0}: Error finding container bac751d9090e7c4f05903e9794efa5c74da73e28e4eaab6116340b8c9d8dbe7b: Status 404 returned error can't find the container with id bac751d9090e7c4f05903e9794efa5c74da73e28e4eaab6116340b8c9d8dbe7b Apr 16 23:29:13.503469 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.503447 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2cxtr"] Apr 16 23:29:13.505554 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:29:13.505513 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbf7e722_63c3_4ad8_b126_d39966fa38f3.slice/crio-5254ea57233a21611ebd660edd104c870f131d4dcf68bf026945f1dec8f90383 WatchSource:0}: Error finding container 
5254ea57233a21611ebd660edd104c870f131d4dcf68bf026945f1dec8f90383: Status 404 returned error can't find the container with id 5254ea57233a21611ebd660edd104c870f131d4dcf68bf026945f1dec8f90383 Apr 16 23:29:13.555247 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.555218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fdk7d" event={"ID":"9ee4f424-f877-4116-acf8-e4a8c70b5329","Type":"ContainerStarted","Data":"bac751d9090e7c4f05903e9794efa5c74da73e28e4eaab6116340b8c9d8dbe7b"} Apr 16 23:29:13.556291 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:13.556270 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2cxtr" event={"ID":"fbf7e722-63c3-4ad8-b126-d39966fa38f3","Type":"ContainerStarted","Data":"5254ea57233a21611ebd660edd104c870f131d4dcf68bf026945f1dec8f90383"} Apr 16 23:29:15.564942 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:15.564854 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2cxtr" event={"ID":"fbf7e722-63c3-4ad8-b126-d39966fa38f3","Type":"ContainerStarted","Data":"3fb7dfe7050119b15495d18344ab179b8210b6584b9bcdfe10f832c44377e8ab"} Apr 16 23:29:15.566519 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:15.566494 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fdk7d" event={"ID":"9ee4f424-f877-4116-acf8-e4a8c70b5329","Type":"ContainerStarted","Data":"8f0583d7b43ca50bc903a028819d0c651bae3b7a015299acfc2b9e7d9b272188"} Apr 16 23:29:15.582054 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:15.581995 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fdk7d" podStartSLOduration=128.747024321 podStartE2EDuration="2m10.581979033s" podCreationTimestamp="2026-04-16 23:27:05 +0000 UTC" firstStartedPulling="2026-04-16 23:29:13.484388072 +0000 UTC m=+161.086307158" lastFinishedPulling="2026-04-16 23:29:15.319342788 +0000 UTC 
m=+162.921261870" observedRunningTime="2026-04-16 23:29:15.580810821 +0000 UTC m=+163.182729922" watchObservedRunningTime="2026-04-16 23:29:15.581979033 +0000 UTC m=+163.183898135" Apr 16 23:29:16.573552 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:16.573508 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2cxtr" event={"ID":"fbf7e722-63c3-4ad8-b126-d39966fa38f3","Type":"ContainerStarted","Data":"3b4875e8b593b93ce0598492ff915ddbcd050453972a90190d4880bf4e7e6251"} Apr 16 23:29:16.590162 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:16.590109 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2cxtr" podStartSLOduration=129.781630857 podStartE2EDuration="2m11.590094596s" podCreationTimestamp="2026-04-16 23:27:05 +0000 UTC" firstStartedPulling="2026-04-16 23:29:13.507217606 +0000 UTC m=+161.109136686" lastFinishedPulling="2026-04-16 23:29:15.31568134 +0000 UTC m=+162.917600425" observedRunningTime="2026-04-16 23:29:16.588115708 +0000 UTC m=+164.190034837" watchObservedRunningTime="2026-04-16 23:29:16.590094596 +0000 UTC m=+164.192013697" Apr 16 23:29:17.576476 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:17.576445 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2cxtr" Apr 16 23:29:23.594310 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:23.594264 2573 generic.go:358] "Generic (PLEG): container finished" podID="65533e30-6d62-4a07-b42f-3d2ebe34b87f" containerID="d5b1c6156f7b746eee080729cfad0d060b3db2ccfc5af285c220535b87ffe01e" exitCode=0 Apr 16 23:29:23.594741 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:23.594330 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" event={"ID":"65533e30-6d62-4a07-b42f-3d2ebe34b87f","Type":"ContainerDied","Data":"d5b1c6156f7b746eee080729cfad0d060b3db2ccfc5af285c220535b87ffe01e"} Apr 16 23:29:23.594741 
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:23.594729 2573 scope.go:117] "RemoveContainer" containerID="d5b1c6156f7b746eee080729cfad0d060b3db2ccfc5af285c220535b87ffe01e" Apr 16 23:29:24.598451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:24.598419 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-g6wh4" event={"ID":"65533e30-6d62-4a07-b42f-3d2ebe34b87f","Type":"ContainerStarted","Data":"e8d3bdf765c38de760049a9244b4e403c593f00ee4ca703a6f0229176932a5c2"} Apr 16 23:29:27.582047 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:27.582014 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2cxtr" Apr 16 23:29:48.201239 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:48.201203 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:29:48.217516 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:48.217491 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:29:48.693912 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:29:48.693885 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:01.953404 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:01.953367 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:30:01.954053 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:01.953999 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="alertmanager" containerID="cri-o://e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f" gracePeriod=120 Apr 16 23:30:01.954192 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:01.954091 2573 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy-web" containerID="cri-o://192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122" gracePeriod=120 Apr 16 23:30:01.954192 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:01.954092 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="prom-label-proxy" containerID="cri-o://a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6" gracePeriod=120 Apr 16 23:30:01.954192 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:01.954118 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy" containerID="cri-o://a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf" gracePeriod=120 Apr 16 23:30:01.954192 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:01.954063 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy-metric" containerID="cri-o://1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f" gracePeriod=120 Apr 16 23:30:01.954192 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:01.954135 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="config-reloader" containerID="cri-o://a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da" gracePeriod=120 Apr 16 23:30:02.721808 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721775 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerID="a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6" exitCode=0 Apr 16 23:30:02.721808 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721801 2573 generic.go:358] "Generic (PLEG): container finished" podID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerID="a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf" exitCode=0 Apr 16 23:30:02.721808 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721808 2573 generic.go:358] "Generic (PLEG): container finished" podID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerID="192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122" exitCode=0 Apr 16 23:30:02.721808 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721813 2573 generic.go:358] "Generic (PLEG): container finished" podID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerID="a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da" exitCode=0 Apr 16 23:30:02.722056 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721821 2573 generic.go:358] "Generic (PLEG): container finished" podID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerID="e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f" exitCode=0 Apr 16 23:30:02.722056 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerDied","Data":"a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6"} Apr 16 23:30:02.722056 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerDied","Data":"a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf"} Apr 16 23:30:02.722056 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721901 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerDied","Data":"192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122"} Apr 16 23:30:02.722056 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721914 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerDied","Data":"a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da"} Apr 16 23:30:02.722056 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:02.721925 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerDied","Data":"e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f"} Apr 16 23:30:03.181462 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.181441 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.368373 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368300 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-cluster-tls-config\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368373 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368334 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368373 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368364 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-volume\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368395 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-main-db\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368423 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkqsl\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-kube-api-access-zkqsl\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: 
\"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368457 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-web\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368496 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-tls-assets\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368558 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368595 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-out\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368619 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-main-tls\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: 
\"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.368669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368662 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-metrics-client-ca\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.369076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368700 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-web-config\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.369076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368724 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-trusted-ca-bundle\") pod \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\" (UID: \"8630dcce-64e8-4324-bc68-af0b8d38d9b3\") " Apr 16 23:30:03.369076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368800 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:30:03.369076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.368982 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-main-db\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.369687 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.369623 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:30:03.369798 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.369699 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:30:03.372053 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.371788 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-kube-api-access-zkqsl" (OuterVolumeSpecName: "kube-api-access-zkqsl") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "kube-api-access-zkqsl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:30:03.372053 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.371899 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:30:03.372053 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.371921 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:03.372053 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.372018 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:03.372335 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.372175 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-volume" (OuterVolumeSpecName: "config-volume") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:03.372335 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.372281 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:03.372335 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.372300 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:03.372852 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.372835 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-out" (OuterVolumeSpecName: "config-out") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:30:03.375477 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.375453 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:03.382948 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.382928 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-web-config" (OuterVolumeSpecName: "web-config") pod "8630dcce-64e8-4324-bc68-af0b8d38d9b3" (UID: "8630dcce-64e8-4324-bc68-af0b8d38d9b3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:03.469320 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469287 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-cluster-tls-config\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469320 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469316 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469331 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-volume\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469344 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zkqsl\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-kube-api-access-zkqsl\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469358 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469371 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8630dcce-64e8-4324-bc68-af0b8d38d9b3-tls-assets\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469380 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469388 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8630dcce-64e8-4324-bc68-af0b8d38d9b3-config-out\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469396 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-secret-alertmanager-main-tls\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469407 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-metrics-client-ca\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469420 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/8630dcce-64e8-4324-bc68-af0b8d38d9b3-web-config\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.469451 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.469434 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8630dcce-64e8-4324-bc68-af0b8d38d9b3-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:03.727146 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.727113 2573 generic.go:358] "Generic (PLEG): container finished" podID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerID="1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f" exitCode=0 Apr 16 23:30:03.727297 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.727190 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerDied","Data":"1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f"} Apr 16 23:30:03.727297 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.727226 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8630dcce-64e8-4324-bc68-af0b8d38d9b3","Type":"ContainerDied","Data":"215524d96274868237c82aaee2cd0d47194ea50a689b5809198a1d2cbf09f07b"} Apr 16 23:30:03.727297 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.727243 2573 scope.go:117] "RemoveContainer" containerID="a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6" Apr 16 23:30:03.727297 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.727252 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.735499 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.735480 2573 scope.go:117] "RemoveContainer" containerID="1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f" Apr 16 23:30:03.743687 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.743669 2573 scope.go:117] "RemoveContainer" containerID="a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf" Apr 16 23:30:03.749645 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.749622 2573 scope.go:117] "RemoveContainer" containerID="192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122" Apr 16 23:30:03.752879 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.752859 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:30:03.756190 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.756169 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:30:03.757013 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.756999 2573 scope.go:117] "RemoveContainer" containerID="a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da" Apr 16 23:30:03.763065 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.763050 2573 scope.go:117] "RemoveContainer" containerID="e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f" Apr 16 23:30:03.769275 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.769258 2573 scope.go:117] "RemoveContainer" containerID="ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515" Apr 16 23:30:03.775823 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.775807 2573 scope.go:117] "RemoveContainer" containerID="a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6" Apr 16 23:30:03.776068 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:03.776048 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6\": container with ID starting with a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6 not found: ID does not exist" containerID="a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6" Apr 16 23:30:03.776120 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.776074 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6"} err="failed to get container status \"a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6\": rpc error: code = NotFound desc = could not find container \"a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6\": container with ID starting with a45a1454648c807a8cee19688ae09ecf0edd5651ea4aaf042bd7cc4f449fd9e6 not found: ID does not exist" Apr 16 23:30:03.776120 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.776103 2573 scope.go:117] "RemoveContainer" containerID="1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f" Apr 16 23:30:03.776336 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:03.776317 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f\": container with ID starting with 1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f not found: ID does not exist" containerID="1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f" Apr 16 23:30:03.776379 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.776342 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f"} err="failed to get container status \"1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f\": rpc error: code = NotFound desc 
= could not find container \"1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f\": container with ID starting with 1a55c3d954163fb9041d2771a27df9e722f065067c4e715aa8356fc3b68cf11f not found: ID does not exist" Apr 16 23:30:03.776379 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.776357 2573 scope.go:117] "RemoveContainer" containerID="a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf" Apr 16 23:30:03.776592 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:03.776573 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf\": container with ID starting with a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf not found: ID does not exist" containerID="a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf" Apr 16 23:30:03.776677 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.776594 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf"} err="failed to get container status \"a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf\": rpc error: code = NotFound desc = could not find container \"a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf\": container with ID starting with a8d520c82efc92b9b278fae154ec63876459553c6ce825af0d46c6327d504bcf not found: ID does not exist" Apr 16 23:30:03.776677 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.776609 2573 scope.go:117] "RemoveContainer" containerID="192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122" Apr 16 23:30:03.776829 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:03.776814 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122\": 
container with ID starting with 192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122 not found: ID does not exist" containerID="192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122" Apr 16 23:30:03.776873 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.776834 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122"} err="failed to get container status \"192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122\": rpc error: code = NotFound desc = could not find container \"192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122\": container with ID starting with 192e50bb178d358c0fb1104aa1e552f5da20f6b33e2184f86548d6a6ee930122 not found: ID does not exist" Apr 16 23:30:03.776873 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.776848 2573 scope.go:117] "RemoveContainer" containerID="a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da" Apr 16 23:30:03.777075 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:03.777059 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da\": container with ID starting with a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da not found: ID does not exist" containerID="a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da" Apr 16 23:30:03.777136 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.777082 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da"} err="failed to get container status \"a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da\": rpc error: code = NotFound desc = could not find container \"a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da\": container with 
ID starting with a4701335469dfa5d715b15602a2cac7342b4e82d7f6cc8b6eadb247c767ce5da not found: ID does not exist" Apr 16 23:30:03.777136 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.777102 2573 scope.go:117] "RemoveContainer" containerID="e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f" Apr 16 23:30:03.777315 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:03.777298 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f\": container with ID starting with e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f not found: ID does not exist" containerID="e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f" Apr 16 23:30:03.777352 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.777320 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f"} err="failed to get container status \"e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f\": rpc error: code = NotFound desc = could not find container \"e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f\": container with ID starting with e1ab9884138f0afd42b86b42337c7ec3088cba2038e9208dbc3228d5d8b64f9f not found: ID does not exist" Apr 16 23:30:03.777352 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.777334 2573 scope.go:117] "RemoveContainer" containerID="ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515" Apr 16 23:30:03.777552 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:03.777523 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515\": container with ID starting with ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515 not found: ID does 
not exist" containerID="ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515" Apr 16 23:30:03.777606 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.777558 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515"} err="failed to get container status \"ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515\": rpc error: code = NotFound desc = could not find container \"ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515\": container with ID starting with ddfb0e3d438e51087e3d3f8dc53cd9251c41619d77fda4f6c8a1f073e3550515 not found: ID does not exist" Apr 16 23:30:03.780335 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780312 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:30:03.780695 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780679 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="config-reloader" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780697 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="config-reloader" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780709 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="init-config-reloader" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780715 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="init-config-reloader" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780720 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" 
containerName="kube-rbac-proxy-web" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780726 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy-web" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780733 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="alertmanager" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780738 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="alertmanager" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780748 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="prom-label-proxy" Apr 16 23:30:03.780762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780753 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="prom-label-proxy" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780773 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780782 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780798 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy-metric" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780804 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy-metric" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780854 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="config-reloader" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780864 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="prom-label-proxy" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780878 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780885 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy-web" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780894 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="kube-rbac-proxy-metric" Apr 16 23:30:03.781017 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.780900 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" containerName="alertmanager" Apr 16 23:30:03.785894 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.785878 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.787876 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.787859 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 23:30:03.787964 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.787861 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 23:30:03.788166 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.788149 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 23:30:03.788166 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.788158 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 23:30:03.788271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.788167 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 23:30:03.788271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.788200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5q2t6\"" Apr 16 23:30:03.788271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.788222 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 23:30:03.788271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.788159 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 23:30:03.788271 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.788266 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 23:30:03.792754 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.792714 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 23:30:03.794905 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.794881 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:30:03.973860 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.973827 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.973996 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.973876 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-config-out\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.973996 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.973928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr888\" (UniqueName: \"kubernetes.io/projected/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-kube-api-access-wr888\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.973996 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.973959 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.973996 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.973983 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.974256 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.974015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.974256 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.974059 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.974256 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.974088 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.974256 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.974136 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.974256 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.974161 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-web-config\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.974256 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.974189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-config-volume\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.974256 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.974219 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:03.974256 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:03.974243 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075384 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075498 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-config-out\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075498 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wr888\" (UniqueName: \"kubernetes.io/projected/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-kube-api-access-wr888\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075498 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075644 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075720 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075687 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075777 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075726 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075777 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075894 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075894 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075829 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-web-config\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075894 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075855 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-config-volume\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.075894 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.076090 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.076090 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.075913 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.076663 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.076638 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.077295 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.077265 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.078465 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.078290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-config-out\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.078578 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.078505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.078760 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.078715 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.078903 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.078875 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.079087 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.079068 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-config-volume\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.079294 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.079273 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-web-config\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.079423 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.079400 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.079792 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.079775 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-tls-assets\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.080548 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.080510 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.082121 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.082101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr888\" (UniqueName: \"kubernetes.io/projected/ba9b01ef-0fef-42ae-8f54-792e6c3257fe-kube-api-access-wr888\") pod \"alertmanager-main-0\" (UID: \"ba9b01ef-0fef-42ae-8f54-792e6c3257fe\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.096717 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.096697 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 23:30:04.214930 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.214899 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 23:30:04.217940 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:30:04.217914 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba9b01ef_0fef_42ae_8f54_792e6c3257fe.slice/crio-e22cf78f6b2e93e4b211130335234349fd87fcf9cf92c337bb3a5eec8a4265ef WatchSource:0}: Error finding container e22cf78f6b2e93e4b211130335234349fd87fcf9cf92c337bb3a5eec8a4265ef: Status 404 returned error can't find the container with id e22cf78f6b2e93e4b211130335234349fd87fcf9cf92c337bb3a5eec8a4265ef Apr 16 23:30:04.732377 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.732339 2573 generic.go:358] "Generic (PLEG): container finished" podID="ba9b01ef-0fef-42ae-8f54-792e6c3257fe" containerID="81ece305bcf74bb063dc76db974a7b66be1af4359527e0bc5e2e8e58be87baeb" exitCode=0 Apr 16 23:30:04.732496 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.732388 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba9b01ef-0fef-42ae-8f54-792e6c3257fe","Type":"ContainerDied","Data":"81ece305bcf74bb063dc76db974a7b66be1af4359527e0bc5e2e8e58be87baeb"} Apr 16 23:30:04.732496 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:04.732416 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba9b01ef-0fef-42ae-8f54-792e6c3257fe","Type":"ContainerStarted","Data":"e22cf78f6b2e93e4b211130335234349fd87fcf9cf92c337bb3a5eec8a4265ef"} Apr 16 23:30:05.059076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:05.059050 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8630dcce-64e8-4324-bc68-af0b8d38d9b3" 
path="/var/lib/kubelet/pods/8630dcce-64e8-4324-bc68-af0b8d38d9b3/volumes" Apr 16 23:30:05.737836 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:05.737801 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba9b01ef-0fef-42ae-8f54-792e6c3257fe","Type":"ContainerStarted","Data":"41d3cdb85df414dd5498eb1c23edf45e9f08e07a3d880d239182ce41e1e8d35c"} Apr 16 23:30:05.737836 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:05.737837 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba9b01ef-0fef-42ae-8f54-792e6c3257fe","Type":"ContainerStarted","Data":"00f14a20b62b1ef9a1a9efb70ff23ca90c8b21dd752e177a78b17382b0fc98cf"} Apr 16 23:30:05.738221 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:05.737847 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba9b01ef-0fef-42ae-8f54-792e6c3257fe","Type":"ContainerStarted","Data":"88c8267e73a4fd66e8a8ca2d16ad30b79c6cd4171f63b0c72786092e131dc78d"} Apr 16 23:30:05.738221 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:05.737856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba9b01ef-0fef-42ae-8f54-792e6c3257fe","Type":"ContainerStarted","Data":"4eafd15bbaf0cd818934af830d2ed5edfc630fb288237ced257f7bbce8e3f43d"} Apr 16 23:30:05.738221 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:05.737864 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"ba9b01ef-0fef-42ae-8f54-792e6c3257fe","Type":"ContainerStarted","Data":"2a4c0becb4f8fe3d09e77ad1bfcef0994256f2a339fc55ffd11618105bd6d002"} Apr 16 23:30:05.738221 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:05.737872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"ba9b01ef-0fef-42ae-8f54-792e6c3257fe","Type":"ContainerStarted","Data":"7a12ca5a0ca0de52c8582e97cc66d6f220f73b66afbb83aa320c16d3259b5528"} Apr 16 23:30:05.764170 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:05.764128 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.764109573 podStartE2EDuration="2.764109573s" podCreationTimestamp="2026-04-16 23:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:30:05.76197767 +0000 UTC m=+213.363896798" watchObservedRunningTime="2026-04-16 23:30:05.764109573 +0000 UTC m=+213.366028675" Apr 16 23:30:06.165919 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.165832 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:30:06.166284 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.166254 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="prometheus" containerID="cri-o://9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" gracePeriod=600 Apr 16 23:30:06.166377 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.166307 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy-web" containerID="cri-o://11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" gracePeriod=600 Apr 16 23:30:06.166377 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.166299 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="thanos-sidecar" 
containerID="cri-o://b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" gracePeriod=600 Apr 16 23:30:06.166377 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.166339 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy-thanos" containerID="cri-o://2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" gracePeriod=600 Apr 16 23:30:06.166557 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.166332 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="config-reloader" containerID="cri-o://e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" gracePeriod=600 Apr 16 23:30:06.166640 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.166273 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy" containerID="cri-o://3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" gracePeriod=600 Apr 16 23:30:06.408224 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.408204 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.593995 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.593967 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-trusted-ca-bundle\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.593995 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.593998 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594021 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-527p9\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-kube-api-access-527p9\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594047 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-kube-rbac-proxy\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594075 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-db\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" 
(UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594110 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-serving-certs-ca-bundle\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594148 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594173 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-rulefiles-0\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594214 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594198 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-kubelet-serving-ca-bundle\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594583 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594524 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") 
pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:30:06.594645 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594563 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:30:06.594645 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594623 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-grpc-tls\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594749 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594656 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-thanos-prometheus-http-client-file\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594802 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594762 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-config-out\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594802 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594789 2573 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-metrics-client-ca\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594896 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594820 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-tls-assets\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594896 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594845 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-config\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.594896 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594873 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-web-config\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.595036 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594903 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:30:06.595036 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594926 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-metrics-client-certs\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.595036 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.594959 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-tls\") pod \"28ee3c24-b647-46de-9386-3a6ce76ed47c\" (UID: \"28ee3c24-b647-46de-9386-3a6ce76ed47c\") " Apr 16 23:30:06.595219 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.595201 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.595278 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.595225 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.595278 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.595240 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.595498 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.595466 2573 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:30:06.595937 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.595906 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:30:06.596467 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.596433 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 23:30:06.596862 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.596836 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-kube-api-access-527p9" (OuterVolumeSpecName: "kube-api-access-527p9") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "kube-api-access-527p9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:30:06.597245 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.597206 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.597492 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.597456 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.598368 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.598341 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.598459 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.598378 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-config-out" (OuterVolumeSpecName: "config-out") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 23:30:06.598459 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.598437 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-config" (OuterVolumeSpecName: "config") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.598647 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.598467 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.598647 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.598614 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.599353 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.599323 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.599598 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.599573 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:30:06.599819 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.599799 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.608272 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.608253 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-web-config" (OuterVolumeSpecName: "web-config") pod "28ee3c24-b647-46de-9386-3a6ce76ed47c" (UID: "28ee3c24-b647-46de-9386-3a6ce76ed47c"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 23:30:06.696199 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696177 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-db\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696199 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696197 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696207 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696217 2573 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-grpc-tls\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696229 2573 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-thanos-prometheus-http-client-file\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696238 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28ee3c24-b647-46de-9386-3a6ce76ed47c-config-out\") on node 
\"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696247 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28ee3c24-b647-46de-9386-3a6ce76ed47c-configmap-metrics-client-ca\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696256 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-tls-assets\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696264 2573 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-config\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696273 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-web-config\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696281 2573 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-metrics-client-certs\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696289 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-tls\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:30:06.696298 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696306 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-527p9\" (UniqueName: \"kubernetes.io/projected/28ee3c24-b647-46de-9386-3a6ce76ed47c-kube-api-access-527p9\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.696669 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.696316 2573 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28ee3c24-b647-46de-9386-3a6ce76ed47c-secret-kube-rbac-proxy\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:30:06.746372 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746345 2573 generic.go:358] "Generic (PLEG): container finished" podID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" exitCode=0 Apr 16 23:30:06.746372 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746370 2573 generic.go:358] "Generic (PLEG): container finished" podID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" exitCode=0 Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746376 2573 generic.go:358] "Generic (PLEG): container finished" podID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" exitCode=0 Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746383 2573 generic.go:358] "Generic (PLEG): container finished" podID="28ee3c24-b647-46de-9386-3a6ce76ed47c" 
containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" exitCode=0 Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746391 2573 generic.go:358] "Generic (PLEG): container finished" podID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" exitCode=0 Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746397 2573 generic.go:358] "Generic (PLEG): container finished" podID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" exitCode=0 Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746429 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerDied","Data":"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1"} Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746443 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746469 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerDied","Data":"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8"} Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746488 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerDied","Data":"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf"} Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746501 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerDied","Data":"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d"} Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerDied","Data":"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d"} Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746521 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerDied","Data":"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88"} Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746530 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"28ee3c24-b647-46de-9386-3a6ce76ed47c","Type":"ContainerDied","Data":"29df712728343e94398009c32255d836c59d719294b0609796a1d10778163a7c"} Apr 16 23:30:06.746690 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.746559 2573 scope.go:117] "RemoveContainer" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" Apr 16 23:30:06.753991 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.753921 2573 scope.go:117] "RemoveContainer" containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" Apr 16 23:30:06.760750 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.760734 2573 scope.go:117] "RemoveContainer" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" Apr 16 23:30:06.767586 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.767524 2573 scope.go:117] "RemoveContainer" containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" Apr 16 23:30:06.768569 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.768510 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:30:06.771858 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.771838 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:30:06.774966 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.774950 2573 scope.go:117] "RemoveContainer" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" Apr 16 23:30:06.780883 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.780868 2573 scope.go:117] "RemoveContainer" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" Apr 16 23:30:06.787562 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.787530 2573 scope.go:117] "RemoveContainer" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" Apr 16 23:30:06.793555 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.793527 2573 scope.go:117] 
"RemoveContainer" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" Apr 16 23:30:06.793800 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:06.793782 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": container with ID starting with 2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1 not found: ID does not exist" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" Apr 16 23:30:06.793863 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.793810 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1"} err="failed to get container status \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": rpc error: code = NotFound desc = could not find container \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": container with ID starting with 2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1 not found: ID does not exist" Apr 16 23:30:06.793863 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.793835 2573 scope.go:117] "RemoveContainer" containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" Apr 16 23:30:06.794094 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:06.794077 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": container with ID starting with 3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8 not found: ID does not exist" containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" Apr 16 23:30:06.794129 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.794101 2573 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8"} err="failed to get container status \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": rpc error: code = NotFound desc = could not find container \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": container with ID starting with 3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8 not found: ID does not exist" Apr 16 23:30:06.794129 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.794116 2573 scope.go:117] "RemoveContainer" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" Apr 16 23:30:06.794315 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:06.794297 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": container with ID starting with 11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf not found: ID does not exist" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" Apr 16 23:30:06.794377 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.794322 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf"} err="failed to get container status \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": rpc error: code = NotFound desc = could not find container \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": container with ID starting with 11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf not found: ID does not exist" Apr 16 23:30:06.794377 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.794342 2573 scope.go:117] "RemoveContainer" containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" Apr 16 
23:30:06.794583 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:06.794567 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": container with ID starting with b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d not found: ID does not exist" containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" Apr 16 23:30:06.794630 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.794589 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d"} err="failed to get container status \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": rpc error: code = NotFound desc = could not find container \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": container with ID starting with b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d not found: ID does not exist" Apr 16 23:30:06.794630 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.794605 2573 scope.go:117] "RemoveContainer" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" Apr 16 23:30:06.794834 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:06.794816 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": container with ID starting with e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d not found: ID does not exist" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" Apr 16 23:30:06.794903 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.794840 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d"} err="failed to get container status \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": rpc error: code = NotFound desc = could not find container \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": container with ID starting with e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d not found: ID does not exist" Apr 16 23:30:06.794903 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.794859 2573 scope.go:117] "RemoveContainer" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" Apr 16 23:30:06.795164 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:30:06.795048 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": container with ID starting with 9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88 not found: ID does not exist" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" Apr 16 23:30:06.795164 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.795068 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88"} err="failed to get container status \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": rpc error: code = NotFound desc = could not find container \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": container with ID starting with 9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88 not found: ID does not exist" Apr 16 23:30:06.795164 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.795085 2573 scope.go:117] "RemoveContainer" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" Apr 16 23:30:06.795603 ip-10-0-136-153 
kubenswrapper[2573]: E0416 23:30:06.795577 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": container with ID starting with 9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782 not found: ID does not exist" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" Apr 16 23:30:06.795704 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.795607 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782"} err="failed to get container status \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": rpc error: code = NotFound desc = could not find container \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": container with ID starting with 9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782 not found: ID does not exist" Apr 16 23:30:06.795704 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.795627 2573 scope.go:117] "RemoveContainer" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" Apr 16 23:30:06.795954 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.795874 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1"} err="failed to get container status \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": rpc error: code = NotFound desc = could not find container \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": container with ID starting with 2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1 not found: ID does not exist" Apr 16 23:30:06.795954 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.795902 2573 scope.go:117] "RemoveContainer" 
containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" Apr 16 23:30:06.796239 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.796219 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8"} err="failed to get container status \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": rpc error: code = NotFound desc = could not find container \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": container with ID starting with 3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8 not found: ID does not exist" Apr 16 23:30:06.796239 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.796238 2573 scope.go:117] "RemoveContainer" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" Apr 16 23:30:06.796517 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.796495 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf"} err="failed to get container status \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": rpc error: code = NotFound desc = could not find container \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": container with ID starting with 11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf not found: ID does not exist" Apr 16 23:30:06.796615 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.796518 2573 scope.go:117] "RemoveContainer" containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" Apr 16 23:30:06.796789 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.796768 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d"} err="failed to get container status 
\"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": rpc error: code = NotFound desc = could not find container \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": container with ID starting with b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d not found: ID does not exist" Apr 16 23:30:06.796857 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.796798 2573 scope.go:117] "RemoveContainer" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" Apr 16 23:30:06.797012 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.796997 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:30:06.797071 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797032 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d"} err="failed to get container status \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": rpc error: code = NotFound desc = could not find container \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": container with ID starting with e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d not found: ID does not exist" Apr 16 23:30:06.797071 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797047 2573 scope.go:117] "RemoveContainer" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" Apr 16 23:30:06.797285 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797260 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88"} err="failed to get container status \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": rpc error: code = NotFound desc = could not find container \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": 
container with ID starting with 9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88 not found: ID does not exist" Apr 16 23:30:06.797285 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797285 2573 scope.go:117] "RemoveContainer" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797289 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="init-config-reloader" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797303 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="init-config-reloader" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797320 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="thanos-sidecar" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797326 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="thanos-sidecar" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797339 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="config-reloader" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797344 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="config-reloader" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797349 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy-web" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:30:06.797355 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy-web" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797360 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797365 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797371 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy-thanos" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797379 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy-thanos" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797391 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="prometheus" Apr 16 23:30:06.797413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797398 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="prometheus" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797459 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy-web" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797471 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="prometheus" Apr 16 23:30:06.797902 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:30:06.797479 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="thanos-sidecar" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797489 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy-thanos" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797495 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="kube-rbac-proxy" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797501 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" containerName="config-reloader" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797518 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782"} err="failed to get container status \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": rpc error: code = NotFound desc = could not find container \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": container with ID starting with 9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782 not found: ID does not exist" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797549 2573 scope.go:117] "RemoveContainer" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797779 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1"} err="failed to get container status 
\"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": rpc error: code = NotFound desc = could not find container \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": container with ID starting with 2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1 not found: ID does not exist" Apr 16 23:30:06.797902 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797792 2573 scope.go:117] "RemoveContainer" containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" Apr 16 23:30:06.798249 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797957 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8"} err="failed to get container status \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": rpc error: code = NotFound desc = could not find container \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": container with ID starting with 3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8 not found: ID does not exist" Apr 16 23:30:06.798249 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.797972 2573 scope.go:117] "RemoveContainer" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" Apr 16 23:30:06.798249 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798163 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf"} err="failed to get container status \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": rpc error: code = NotFound desc = could not find container \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": container with ID starting with 11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf not found: ID does not exist" Apr 16 23:30:06.798249 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:30:06.798176 2573 scope.go:117] "RemoveContainer" containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" Apr 16 23:30:06.798427 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798340 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d"} err="failed to get container status \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": rpc error: code = NotFound desc = could not find container \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": container with ID starting with b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d not found: ID does not exist" Apr 16 23:30:06.798427 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798365 2573 scope.go:117] "RemoveContainer" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" Apr 16 23:30:06.798578 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798562 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d"} err="failed to get container status \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": rpc error: code = NotFound desc = could not find container \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": container with ID starting with e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d not found: ID does not exist" Apr 16 23:30:06.798631 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798578 2573 scope.go:117] "RemoveContainer" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" Apr 16 23:30:06.798757 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798731 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88"} err="failed to get container status \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": rpc error: code = NotFound desc = could not find container \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": container with ID starting with 9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88 not found: ID does not exist" Apr 16 23:30:06.798801 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798757 2573 scope.go:117] "RemoveContainer" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" Apr 16 23:30:06.798922 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798908 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782"} err="failed to get container status \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": rpc error: code = NotFound desc = could not find container \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": container with ID starting with 9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782 not found: ID does not exist" Apr 16 23:30:06.798969 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.798922 2573 scope.go:117] "RemoveContainer" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" Apr 16 23:30:06.799076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.799062 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1"} err="failed to get container status \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": rpc error: code = NotFound desc = could not find container \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": container with ID starting with 
2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1 not found: ID does not exist" Apr 16 23:30:06.799123 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.799076 2573 scope.go:117] "RemoveContainer" containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" Apr 16 23:30:06.799290 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.799270 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8"} err="failed to get container status \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": rpc error: code = NotFound desc = could not find container \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": container with ID starting with 3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8 not found: ID does not exist" Apr 16 23:30:06.799290 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.799288 2573 scope.go:117] "RemoveContainer" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" Apr 16 23:30:06.799511 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.799493 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf"} err="failed to get container status \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": rpc error: code = NotFound desc = could not find container \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": container with ID starting with 11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf not found: ID does not exist" Apr 16 23:30:06.799627 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.799513 2573 scope.go:117] "RemoveContainer" containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" Apr 16 23:30:06.799806 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.799782 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d"} err="failed to get container status \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": rpc error: code = NotFound desc = could not find container \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": container with ID starting with b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d not found: ID does not exist" Apr 16 23:30:06.799882 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.799806 2573 scope.go:117] "RemoveContainer" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" Apr 16 23:30:06.800033 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800016 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d"} err="failed to get container status \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": rpc error: code = NotFound desc = could not find container \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": container with ID starting with e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d not found: ID does not exist" Apr 16 23:30:06.800102 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800036 2573 scope.go:117] "RemoveContainer" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" Apr 16 23:30:06.800275 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800256 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88"} err="failed to get container status \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": rpc error: code = NotFound desc = could not find container 
\"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": container with ID starting with 9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88 not found: ID does not exist" Apr 16 23:30:06.800321 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800276 2573 scope.go:117] "RemoveContainer" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" Apr 16 23:30:06.800467 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800449 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782"} err="failed to get container status \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": rpc error: code = NotFound desc = could not find container \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": container with ID starting with 9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782 not found: ID does not exist" Apr 16 23:30:06.800525 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800468 2573 scope.go:117] "RemoveContainer" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" Apr 16 23:30:06.800681 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800665 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1"} err="failed to get container status \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": rpc error: code = NotFound desc = could not find container \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": container with ID starting with 2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1 not found: ID does not exist" Apr 16 23:30:06.800723 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800681 2573 scope.go:117] "RemoveContainer" 
containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" Apr 16 23:30:06.800871 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800850 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8"} err="failed to get container status \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": rpc error: code = NotFound desc = could not find container \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": container with ID starting with 3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8 not found: ID does not exist" Apr 16 23:30:06.800871 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.800869 2573 scope.go:117] "RemoveContainer" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" Apr 16 23:30:06.801078 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801061 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf"} err="failed to get container status \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": rpc error: code = NotFound desc = could not find container \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": container with ID starting with 11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf not found: ID does not exist" Apr 16 23:30:06.801141 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801078 2573 scope.go:117] "RemoveContainer" containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" Apr 16 23:30:06.801279 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801260 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d"} err="failed to get container status 
\"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": rpc error: code = NotFound desc = could not find container \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": container with ID starting with b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d not found: ID does not exist" Apr 16 23:30:06.801329 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801280 2573 scope.go:117] "RemoveContainer" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" Apr 16 23:30:06.801488 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801473 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d"} err="failed to get container status \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": rpc error: code = NotFound desc = could not find container \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": container with ID starting with e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d not found: ID does not exist" Apr 16 23:30:06.801549 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801488 2573 scope.go:117] "RemoveContainer" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" Apr 16 23:30:06.801673 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801657 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88"} err="failed to get container status \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": rpc error: code = NotFound desc = could not find container \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": container with ID starting with 9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88 not found: ID does not exist" Apr 16 23:30:06.801710 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:30:06.801675 2573 scope.go:117] "RemoveContainer" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" Apr 16 23:30:06.801888 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801873 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782"} err="failed to get container status \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": rpc error: code = NotFound desc = could not find container \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": container with ID starting with 9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782 not found: ID does not exist" Apr 16 23:30:06.801925 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.801888 2573 scope.go:117] "RemoveContainer" containerID="2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1" Apr 16 23:30:06.802046 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802032 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1"} err="failed to get container status \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": rpc error: code = NotFound desc = could not find container \"2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1\": container with ID starting with 2504c62d559231222be550b653e69acf2571e66f5f0262f7b3cfd0325ed283a1 not found: ID does not exist" Apr 16 23:30:06.802046 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802045 2573 scope.go:117] "RemoveContainer" containerID="3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8" Apr 16 23:30:06.802205 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802189 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8"} err="failed to get container status \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": rpc error: code = NotFound desc = could not find container \"3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8\": container with ID starting with 3b566c6d5c940dadc725e25f6a46772d1751601bdce667083e0d584851dcafe8 not found: ID does not exist" Apr 16 23:30:06.802241 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802205 2573 scope.go:117] "RemoveContainer" containerID="11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf" Apr 16 23:30:06.802400 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802384 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf"} err="failed to get container status \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": rpc error: code = NotFound desc = could not find container \"11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf\": container with ID starting with 11bfc785ad500a11d13a1575efb16f35cf7fb99cda702e32c8c1268c39a23dbf not found: ID does not exist" Apr 16 23:30:06.802446 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802399 2573 scope.go:117] "RemoveContainer" containerID="b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d" Apr 16 23:30:06.802588 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802572 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d"} err="failed to get container status \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": rpc error: code = NotFound desc = could not find container \"b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d\": container with ID starting with 
b8e4d83c4fef1af25801537a58d3991ee0334627755f96ee1a6e2df6fc7b7f8d not found: ID does not exist" Apr 16 23:30:06.802626 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802588 2573 scope.go:117] "RemoveContainer" containerID="e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d" Apr 16 23:30:06.802755 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802740 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d"} err="failed to get container status \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": rpc error: code = NotFound desc = could not find container \"e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d\": container with ID starting with e4c8022327f0a9c80da6f4c08510fb905b9a0e6382346ddafe36db0b3f0d737d not found: ID does not exist" Apr 16 23:30:06.802791 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802756 2573 scope.go:117] "RemoveContainer" containerID="9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88" Apr 16 23:30:06.802932 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802917 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.802966 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802934 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88"} err="failed to get container status \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": rpc error: code = NotFound desc = could not find container \"9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88\": container with ID starting with 9d57e38b53074337bb8b2f6114e5cf8b63d65a43b84d780d4936aed3020d2c88 not found: ID does not exist" Apr 16 23:30:06.802966 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.802945 2573 scope.go:117] "RemoveContainer" containerID="9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782" Apr 16 23:30:06.803553 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.803163 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782"} err="failed to get container status \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": rpc error: code = NotFound desc = could not find container \"9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782\": container with ID starting with 9b776da7707a824eeb8fff13f4ab1b4a3ab5d5bd63d81d6e308d09fb1c9ba782 not found: ID does not exist" Apr 16 23:30:06.805251 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805235 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 23:30:06.805485 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-w2lnx\"" Apr 16 23:30:06.805485 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:30:06.805337 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 23:30:06.805485 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805350 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 23:30:06.805485 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805361 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 23:30:06.805485 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805419 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 23:30:06.805695 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805510 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 23:30:06.805771 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805753 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 23:30:06.805771 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805766 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 23:30:06.805865 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805761 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-7sd0ko30jut1s\"" Apr 16 23:30:06.805865 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.805812 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 23:30:06.806712 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.806451 2573 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 23:30:06.806712 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.806575 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 23:30:06.813358 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.811634 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 23:30:06.814401 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.814049 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 23:30:06.816032 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.816011 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:30:06.897213 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897152 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897213 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897206 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897343 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897245 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897438 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897419 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-config\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897557 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897515 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897625 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897575 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l7ll\" (UniqueName: \"kubernetes.io/projected/b9bd0ade-c387-412e-8587-c2fa7e09c914-kube-api-access-6l7ll\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897661 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897704 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:30:06.897679 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897745 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-web-config\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897781 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897770 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897855 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897855 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897817 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897855 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897988 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897871 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9bd0ade-c387-412e-8587-c2fa7e09c914-config-out\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897988 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897897 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9bd0ade-c387-412e-8587-c2fa7e09c914-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.897988 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:30:06.897978 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.898137 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.897995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.998997 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.998973 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999112 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999112 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999028 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999226 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999139 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999226 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999187 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9bd0ade-c387-412e-8587-c2fa7e09c914-config-out\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999325 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999234 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9bd0ade-c387-412e-8587-c2fa7e09c914-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999325 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999263 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999325 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999457 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999457 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999361 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999457 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999457 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999429 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-config\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999679 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999679 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l7ll\" (UniqueName: \"kubernetes.io/projected/b9bd0ade-c387-412e-8587-c2fa7e09c914-kube-api-access-6l7ll\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999679 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999679 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999679 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999595 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999679 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:30:06.999638 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-web-config\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999956 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999798 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:06.999956 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:06.999836 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.000119 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.000092 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.001503 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.001476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b9bd0ade-c387-412e-8587-c2fa7e09c914-config-out\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.001970 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:30:07.001946 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b9bd0ade-c387-412e-8587-c2fa7e09c914-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.002153 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.002130 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.002291 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.002273 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-web-config\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.002587 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.002563 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.002879 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.002845 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
23:30:07.003126 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.003102 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.003228 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.003207 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.003970 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.003943 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.004095 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.004051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.004095 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.004051 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.004624 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.004605 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.005094 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.005079 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b9bd0ade-c387-412e-8587-c2fa7e09c914-config\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.005156 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.005139 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b9bd0ade-c387-412e-8587-c2fa7e09c914-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.009754 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.009733 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l7ll\" (UniqueName: \"kubernetes.io/projected/b9bd0ade-c387-412e-8587-c2fa7e09c914-kube-api-access-6l7ll\") pod \"prometheus-k8s-0\" (UID: \"b9bd0ade-c387-412e-8587-c2fa7e09c914\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.058627 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.058604 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ee3c24-b647-46de-9386-3a6ce76ed47c" path="/var/lib/kubelet/pods/28ee3c24-b647-46de-9386-3a6ce76ed47c/volumes" Apr 16 23:30:07.117634 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:30:07.117611 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:07.268360 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.268258 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 23:30:07.272296 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:30:07.272266 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bd0ade_c387_412e_8587_c2fa7e09c914.slice/crio-f48e1e80f06b0d12095167ee480598e65d8b82beb6278b02c9690f1b300fe987 WatchSource:0}: Error finding container f48e1e80f06b0d12095167ee480598e65d8b82beb6278b02c9690f1b300fe987: Status 404 returned error can't find the container with id f48e1e80f06b0d12095167ee480598e65d8b82beb6278b02c9690f1b300fe987 Apr 16 23:30:07.751756 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.751725 2573 generic.go:358] "Generic (PLEG): container finished" podID="b9bd0ade-c387-412e-8587-c2fa7e09c914" containerID="7aa9590b3b7ea0c75ecb94b1f0042b98926c55463edb30c1dc5594c84a339620" exitCode=0 Apr 16 23:30:07.752112 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.751818 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9bd0ade-c387-412e-8587-c2fa7e09c914","Type":"ContainerDied","Data":"7aa9590b3b7ea0c75ecb94b1f0042b98926c55463edb30c1dc5594c84a339620"} Apr 16 23:30:07.752112 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:07.751862 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9bd0ade-c387-412e-8587-c2fa7e09c914","Type":"ContainerStarted","Data":"f48e1e80f06b0d12095167ee480598e65d8b82beb6278b02c9690f1b300fe987"} Apr 16 23:30:08.760108 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:08.760070 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b9bd0ade-c387-412e-8587-c2fa7e09c914","Type":"ContainerStarted","Data":"47de9b18cace83ba99223c3a8fb434ddfaae4b53884bef420af1f7aec587b548"} Apr 16 23:30:08.760108 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:08.760111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9bd0ade-c387-412e-8587-c2fa7e09c914","Type":"ContainerStarted","Data":"142b28de789d8b470cba77fbfeabf2b5412b5eb0d0243a0e57b3ef12f06de718"} Apr 16 23:30:08.760480 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:08.760120 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9bd0ade-c387-412e-8587-c2fa7e09c914","Type":"ContainerStarted","Data":"d5524c2348d42419dd390824e5bc62c86d1d0db07f958e71d7137e2627b1aef0"} Apr 16 23:30:08.760480 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:08.760131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9bd0ade-c387-412e-8587-c2fa7e09c914","Type":"ContainerStarted","Data":"c04229ad9cc6a46e71480fcba27d104166101545f6dee88c7adf05a499522e5c"} Apr 16 23:30:08.760480 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:08.760140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9bd0ade-c387-412e-8587-c2fa7e09c914","Type":"ContainerStarted","Data":"de1c6881ba97bebc209ad02ccf3df86911fe655753aebe402b02c1afb1505fdb"} Apr 16 23:30:08.760480 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:08.760153 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b9bd0ade-c387-412e-8587-c2fa7e09c914","Type":"ContainerStarted","Data":"34e19baebfe6b0687e00f030867e17dc0c17af7c6816deb92fab7012792f335e"} Apr 16 23:30:08.790411 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:08.786349 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.786330063 podStartE2EDuration="2.786330063s" podCreationTimestamp="2026-04-16 23:30:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:30:08.783443583 +0000 UTC m=+216.385362683" watchObservedRunningTime="2026-04-16 23:30:08.786330063 +0000 UTC m=+216.388249166" Apr 16 23:30:12.118654 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:12.118618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 23:30:46.957190 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:46.957116 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bfwst"] Apr 16 23:30:46.960402 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:46.960386 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bfwst" Apr 16 23:30:46.962447 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:46.962427 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 23:30:46.966388 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:46.966366 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bfwst"] Apr 16 23:30:47.094224 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.094196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/18bc468b-4161-4117-b3e2-0607b32b04f4-dbus\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst" Apr 16 23:30:47.094367 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.094238 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/18bc468b-4161-4117-b3e2-0607b32b04f4-kubelet-config\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.094367 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.094261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/18bc468b-4161-4117-b3e2-0607b32b04f4-original-pull-secret\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.195401 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.195376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/18bc468b-4161-4117-b3e2-0607b32b04f4-dbus\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.195519 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.195420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/18bc468b-4161-4117-b3e2-0607b32b04f4-kubelet-config\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.195592 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.195559 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/18bc468b-4161-4117-b3e2-0607b32b04f4-dbus\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.195592 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.195556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/18bc468b-4161-4117-b3e2-0607b32b04f4-original-pull-secret\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.195687 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.195594 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/18bc468b-4161-4117-b3e2-0607b32b04f4-kubelet-config\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.197941 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.197915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/18bc468b-4161-4117-b3e2-0607b32b04f4-original-pull-secret\") pod \"global-pull-secret-syncer-bfwst\" (UID: \"18bc468b-4161-4117-b3e2-0607b32b04f4\") " pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.270049 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.269974 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bfwst"
Apr 16 23:30:47.385423 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.385389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bfwst"]
Apr 16 23:30:47.388393 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:30:47.388364 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18bc468b_4161_4117_b3e2_0607b32b04f4.slice/crio-88d9f859076675ed8641ac64818364add1790237d716a0f9bdbda8a576b55041 WatchSource:0}: Error finding container 88d9f859076675ed8641ac64818364add1790237d716a0f9bdbda8a576b55041: Status 404 returned error can't find the container with id 88d9f859076675ed8641ac64818364add1790237d716a0f9bdbda8a576b55041
Apr 16 23:30:47.885614 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:47.885567 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bfwst" event={"ID":"18bc468b-4161-4117-b3e2-0607b32b04f4","Type":"ContainerStarted","Data":"88d9f859076675ed8641ac64818364add1790237d716a0f9bdbda8a576b55041"}
Apr 16 23:30:51.899531 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:51.899488 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bfwst" event={"ID":"18bc468b-4161-4117-b3e2-0607b32b04f4","Type":"ContainerStarted","Data":"ba24be4c1311bd3ed619cd11ef03ac1044b4d22ec1a7c055b1e4a6ff9ae94051"}
Apr 16 23:30:51.913077 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:30:51.913030 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bfwst" podStartSLOduration=2.151732364 podStartE2EDuration="5.913016117s" podCreationTimestamp="2026-04-16 23:30:46 +0000 UTC" firstStartedPulling="2026-04-16 23:30:47.39013503 +0000 UTC m=+254.992054110" lastFinishedPulling="2026-04-16 23:30:51.15141878 +0000 UTC m=+258.753337863" observedRunningTime="2026-04-16 23:30:51.911959291 +0000 UTC m=+259.513878394" watchObservedRunningTime="2026-04-16 23:30:51.913016117 +0000 UTC m=+259.514935263"
Apr 16 23:31:07.118731 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:31:07.118695 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:31:07.134917 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:31:07.134894 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:31:07.962396 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:31:07.962371 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 23:31:32.942263 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:31:32.942240 2573 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 23:32:08.211174 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.211143 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"]
Apr 16 23:32:08.213196 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.213181 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.216667 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.216641 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-lk878\""
Apr 16 23:32:08.217427 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.217400 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 23:32:08.217569 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.217479 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"]
Apr 16 23:32:08.218355 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.218328 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 23:32:08.218355 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.218352 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 23:32:08.218561 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.218357 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 23:32:08.218561 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.218524 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 23:32:08.219679 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.219661 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.222685 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.222665 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 23:32:08.222791 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.222670 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 23:32:08.222791 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.222691 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 23:32:08.222791 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.222717 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 23:32:08.223014 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.222999 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-v6gnw\""
Apr 16 23:32:08.229198 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.229177 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"]
Apr 16 23:32:08.242728 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.242703 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"]
Apr 16 23:32:08.288963 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.288940 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2bb5cfe-3a91-4560-b161-1a47586b8cae-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.289053 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.288984 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0df32e9-0dba-4bd0-a534-af2d3a867627-cert\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.289053 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.289008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2bb5cfe-3a91-4560-b161-1a47586b8cae-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.289123 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.289053 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e0df32e9-0dba-4bd0-a534-af2d3a867627-manager-config\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.289123 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.289082 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e0df32e9-0dba-4bd0-a534-af2d3a867627-kube-api-access-xbqp8\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.289188 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.289124 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0df32e9-0dba-4bd0-a534-af2d3a867627-metrics-cert\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.289188 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.289146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56tq\" (UniqueName: \"kubernetes.io/projected/c2bb5cfe-3a91-4560-b161-1a47586b8cae-kube-api-access-d56tq\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.389685 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.389661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0df32e9-0dba-4bd0-a534-af2d3a867627-metrics-cert\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.389772 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.389692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d56tq\" (UniqueName: \"kubernetes.io/projected/c2bb5cfe-3a91-4560-b161-1a47586b8cae-kube-api-access-d56tq\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.389772 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.389716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2bb5cfe-3a91-4560-b161-1a47586b8cae-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.389905 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.389885 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0df32e9-0dba-4bd0-a534-af2d3a867627-cert\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.389963 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.389929 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2bb5cfe-3a91-4560-b161-1a47586b8cae-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.390015 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.389969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e0df32e9-0dba-4bd0-a534-af2d3a867627-manager-config\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.390015 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.390006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e0df32e9-0dba-4bd0-a534-af2d3a867627-kube-api-access-xbqp8\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.390701 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.390678 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/e0df32e9-0dba-4bd0-a534-af2d3a867627-manager-config\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.392468 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.392444 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2bb5cfe-3a91-4560-b161-1a47586b8cae-webhook-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.392468 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.392461 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e0df32e9-0dba-4bd0-a534-af2d3a867627-cert\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.392647 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.392485 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2bb5cfe-3a91-4560-b161-1a47586b8cae-apiservice-cert\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.392647 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.392560 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0df32e9-0dba-4bd0-a534-af2d3a867627-metrics-cert\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.415237 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.415214 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e0df32e9-0dba-4bd0-a534-af2d3a867627-kube-api-access-xbqp8\") pod \"lws-controller-manager-86bf875fd5-mkdfq\" (UID: \"e0df32e9-0dba-4bd0-a534-af2d3a867627\") " pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.415336 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.415260 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56tq\" (UniqueName: \"kubernetes.io/projected/c2bb5cfe-3a91-4560-b161-1a47586b8cae-kube-api-access-d56tq\") pod \"opendatahub-operator-controller-manager-8bf69b96d-w9qtc\" (UID: \"c2bb5cfe-3a91-4560-b161-1a47586b8cae\") " pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.523763 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.523708 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:08.531460 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.531438 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:08.658646 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.658615 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"]
Apr 16 23:32:08.662932 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:32:08.662902 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0df32e9_0dba_4bd0_a534_af2d3a867627.slice/crio-1799602e83fc96b27c25bef71416680c3ff93ee3aa98464bc5bf979e35a166ae WatchSource:0}: Error finding container 1799602e83fc96b27c25bef71416680c3ff93ee3aa98464bc5bf979e35a166ae: Status 404 returned error can't find the container with id 1799602e83fc96b27c25bef71416680c3ff93ee3aa98464bc5bf979e35a166ae
Apr 16 23:32:08.664773 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.664754 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 23:32:08.674767 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:08.674747 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"]
Apr 16 23:32:08.677769 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:32:08.677749 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2bb5cfe_3a91_4560_b161_1a47586b8cae.slice/crio-11532ddcb23320fc9de258b48e69cca470f9ff83d9856824e7fb0440c0f7929b WatchSource:0}: Error finding container 11532ddcb23320fc9de258b48e69cca470f9ff83d9856824e7fb0440c0f7929b: Status 404 returned error can't find the container with id 11532ddcb23320fc9de258b48e69cca470f9ff83d9856824e7fb0440c0f7929b
Apr 16 23:32:09.128000 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:09.127970 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc" event={"ID":"c2bb5cfe-3a91-4560-b161-1a47586b8cae","Type":"ContainerStarted","Data":"11532ddcb23320fc9de258b48e69cca470f9ff83d9856824e7fb0440c0f7929b"}
Apr 16 23:32:09.129013 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:09.128990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq" event={"ID":"e0df32e9-0dba-4bd0-a534-af2d3a867627","Type":"ContainerStarted","Data":"1799602e83fc96b27c25bef71416680c3ff93ee3aa98464bc5bf979e35a166ae"}
Apr 16 23:32:13.145032 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:13.144999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc" event={"ID":"c2bb5cfe-3a91-4560-b161-1a47586b8cae","Type":"ContainerStarted","Data":"33b3878684e026109a4aa3456f6426e60602052b9bd6c129d17ab5a111cf3934"}
Apr 16 23:32:13.145447 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:13.145056 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:13.146280 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:13.146259 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq" event={"ID":"e0df32e9-0dba-4bd0-a534-af2d3a867627","Type":"ContainerStarted","Data":"0c41d54ad2ae41eeeef75ce80eca491efa861c2380c61f1be7c8fbab9459e237"}
Apr 16 23:32:13.146385 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:13.146367 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:13.164681 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:13.164641 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc" podStartSLOduration=1.6422557599999998 podStartE2EDuration="5.164631134s" podCreationTimestamp="2026-04-16 23:32:08 +0000 UTC" firstStartedPulling="2026-04-16 23:32:08.679009905 +0000 UTC m=+336.280928984" lastFinishedPulling="2026-04-16 23:32:12.201385276 +0000 UTC m=+339.803304358" observedRunningTime="2026-04-16 23:32:13.163146122 +0000 UTC m=+340.765065233" watchObservedRunningTime="2026-04-16 23:32:13.164631134 +0000 UTC m=+340.766550235"
Apr 16 23:32:13.180699 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:13.180655 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq" podStartSLOduration=1.6129098800000001 podStartE2EDuration="5.180640621s" podCreationTimestamp="2026-04-16 23:32:08 +0000 UTC" firstStartedPulling="2026-04-16 23:32:08.664924855 +0000 UTC m=+336.266843934" lastFinishedPulling="2026-04-16 23:32:12.232655579 +0000 UTC m=+339.834574675" observedRunningTime="2026-04-16 23:32:13.179877292 +0000 UTC m=+340.781796405" watchObservedRunningTime="2026-04-16 23:32:13.180640621 +0000 UTC m=+340.782559722"
Apr 16 23:32:24.152684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:24.152649 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-86bf875fd5-mkdfq"
Apr 16 23:32:24.153157 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:24.152824 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-8bf69b96d-w9qtc"
Apr 16 23:32:57.199784 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.199741 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"]
Apr 16 23:32:57.203915 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.203890 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.206282 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.206250 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-vwcvn\""
Apr 16 23:32:57.206404 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.206384 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 23:32:57.212867 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.212836 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"]
Apr 16 23:32:57.349729 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.349703 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.349858 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.349737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbd01c30-104e-4334-bb60-ecf20077118e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.349858 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.349766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbd01c30-104e-4334-bb60-ecf20077118e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.349858 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.349784 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sjx5\" (UniqueName: \"kubernetes.io/projected/cbd01c30-104e-4334-bb60-ecf20077118e-kube-api-access-9sjx5\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.349983 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.349887 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.349983 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.349921 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.349983 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.349945 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbd01c30-104e-4334-bb60-ecf20077118e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.350081 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.349992 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.350081 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.350011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.450848 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.450774 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.450848 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.450809 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451051 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.450854 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451051 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.450873 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbd01c30-104e-4334-bb60-ecf20077118e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451051 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.450900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbd01c30-104e-4334-bb60-ecf20077118e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451051 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.450918 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9sjx5\" (UniqueName: \"kubernetes.io/projected/cbd01c30-104e-4334-bb60-ecf20077118e-kube-api-access-9sjx5\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451051 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.450948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451051 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.450971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451399 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.451188 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbd01c30-104e-4334-bb60-ecf20077118e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451399 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.451202 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451399 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.451257 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451399 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.451296 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451528 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.451424 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.451891 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.451868 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/cbd01c30-104e-4334-bb60-ecf20077118e-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.453180 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.453153 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/cbd01c30-104e-4334-bb60-ecf20077118e-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.453287 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.453271 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/cbd01c30-104e-4334-bb60-ecf20077118e-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.457768 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.457744 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/cbd01c30-104e-4334-bb60-ecf20077118e-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID: \"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"
Apr 16 23:32:57.458327 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.458306 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sjx5\" (UniqueName: \"kubernetes.io/projected/cbd01c30-104e-4334-bb60-ecf20077118e-kube-api-access-9sjx5\") pod \"data-science-gateway-data-science-gateway-class-55cc67557f2sllk\" (UID:
\"cbd01c30-104e-4334-bb60-ecf20077118e\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" Apr 16 23:32:57.516904 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.516881 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" Apr 16 23:32:57.634072 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:57.634044 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk"] Apr 16 23:32:57.636194 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:32:57.636168 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbd01c30_104e_4334_bb60_ecf20077118e.slice/crio-b74a6a3832b25c4687f30dfbed0fd9fed11b4d0af301cf560d9c38a6be1a0312 WatchSource:0}: Error finding container b74a6a3832b25c4687f30dfbed0fd9fed11b4d0af301cf560d9c38a6be1a0312: Status 404 returned error can't find the container with id b74a6a3832b25c4687f30dfbed0fd9fed11b4d0af301cf560d9c38a6be1a0312 Apr 16 23:32:58.298410 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:58.298366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" event={"ID":"cbd01c30-104e-4334-bb60-ecf20077118e","Type":"ContainerStarted","Data":"b74a6a3832b25c4687f30dfbed0fd9fed11b4d0af301cf560d9c38a6be1a0312"} Apr 16 23:32:59.959762 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:59.959725 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 23:32:59.959982 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:59.959802 2573 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 23:32:59.959982 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:32:59.959829 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 23:33:00.305916 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:00.305842 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" event={"ID":"cbd01c30-104e-4334-bb60-ecf20077118e","Type":"ContainerStarted","Data":"8e0dedf205d3541d52cb6304ba1a23bc4af33a5152a743ce0ac8d407fedba2cf"} Apr 16 23:33:00.325904 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:00.325857 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" podStartSLOduration=1.004564122 podStartE2EDuration="3.325845194s" podCreationTimestamp="2026-04-16 23:32:57 +0000 UTC" firstStartedPulling="2026-04-16 23:32:57.638192567 +0000 UTC m=+385.240111661" lastFinishedPulling="2026-04-16 23:32:59.959473649 +0000 UTC m=+387.561392733" observedRunningTime="2026-04-16 23:33:00.323522096 +0000 UTC m=+387.925441210" watchObservedRunningTime="2026-04-16 23:33:00.325845194 +0000 UTC m=+387.927764294" Apr 16 23:33:00.517131 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:00.517103 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" Apr 16 23:33:01.521966 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:01.521941 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" Apr 16 23:33:02.317084 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:33:02.317041 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" Apr 16 23:33:02.318297 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:02.318273 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557f2sllk" Apr 16 23:33:27.058802 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.058766 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-277jv"] Apr 16 23:33:27.062090 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.062073 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-277jv" Apr 16 23:33:27.064275 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.064252 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-cdkmm\"" Apr 16 23:33:27.064487 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.064470 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 23:33:27.065105 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.065084 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 23:33:27.068418 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.068396 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-277jv"] Apr 16 23:33:27.165009 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.164973 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8g7n\" (UniqueName: \"kubernetes.io/projected/9b4a05c1-16ce-4168-b3fc-f8e4d320285d-kube-api-access-n8g7n\") 
pod \"kuadrant-operator-catalog-277jv\" (UID: \"9b4a05c1-16ce-4168-b3fc-f8e4d320285d\") " pod="kuadrant-system/kuadrant-operator-catalog-277jv" Apr 16 23:33:27.266307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.266277 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8g7n\" (UniqueName: \"kubernetes.io/projected/9b4a05c1-16ce-4168-b3fc-f8e4d320285d-kube-api-access-n8g7n\") pod \"kuadrant-operator-catalog-277jv\" (UID: \"9b4a05c1-16ce-4168-b3fc-f8e4d320285d\") " pod="kuadrant-system/kuadrant-operator-catalog-277jv" Apr 16 23:33:27.274314 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.274289 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8g7n\" (UniqueName: \"kubernetes.io/projected/9b4a05c1-16ce-4168-b3fc-f8e4d320285d-kube-api-access-n8g7n\") pod \"kuadrant-operator-catalog-277jv\" (UID: \"9b4a05c1-16ce-4168-b3fc-f8e4d320285d\") " pod="kuadrant-system/kuadrant-operator-catalog-277jv" Apr 16 23:33:27.372745 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.372684 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-277jv" Apr 16 23:33:27.428958 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.428927 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-277jv"] Apr 16 23:33:27.490107 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.490083 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-277jv"] Apr 16 23:33:27.492400 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:33:27.492375 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4a05c1_16ce_4168_b3fc_f8e4d320285d.slice/crio-a52365b57d9935e54c4090e4a468ad611628feed5efad57cb02da0950cdea277 WatchSource:0}: Error finding container a52365b57d9935e54c4090e4a468ad611628feed5efad57cb02da0950cdea277: Status 404 returned error can't find the container with id a52365b57d9935e54c4090e4a468ad611628feed5efad57cb02da0950cdea277 Apr 16 23:33:27.638661 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.638605 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dg642"] Apr 16 23:33:27.643435 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.643420 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:27.648269 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.648244 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dg642"] Apr 16 23:33:27.669913 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.669891 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgv9v\" (UniqueName: \"kubernetes.io/projected/e089524b-e02c-46a3-adc8-2e96780cb78c-kube-api-access-hgv9v\") pod \"kuadrant-operator-catalog-dg642\" (UID: \"e089524b-e02c-46a3-adc8-2e96780cb78c\") " pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:27.771127 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.771102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgv9v\" (UniqueName: \"kubernetes.io/projected/e089524b-e02c-46a3-adc8-2e96780cb78c-kube-api-access-hgv9v\") pod \"kuadrant-operator-catalog-dg642\" (UID: \"e089524b-e02c-46a3-adc8-2e96780cb78c\") " pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:27.779869 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.779843 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgv9v\" (UniqueName: \"kubernetes.io/projected/e089524b-e02c-46a3-adc8-2e96780cb78c-kube-api-access-hgv9v\") pod \"kuadrant-operator-catalog-dg642\" (UID: \"e089524b-e02c-46a3-adc8-2e96780cb78c\") " pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:27.953464 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:27.953442 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:28.070938 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:28.070905 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dg642"] Apr 16 23:33:28.076320 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:33:28.076254 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode089524b_e02c_46a3_adc8_2e96780cb78c.slice/crio-9993a5205d283f9f5180d156de9e0a435c294f393a8fbb2b442b6456ad205d8e WatchSource:0}: Error finding container 9993a5205d283f9f5180d156de9e0a435c294f393a8fbb2b442b6456ad205d8e: Status 404 returned error can't find the container with id 9993a5205d283f9f5180d156de9e0a435c294f393a8fbb2b442b6456ad205d8e Apr 16 23:33:28.400752 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:28.400655 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dg642" event={"ID":"e089524b-e02c-46a3-adc8-2e96780cb78c","Type":"ContainerStarted","Data":"9993a5205d283f9f5180d156de9e0a435c294f393a8fbb2b442b6456ad205d8e"} Apr 16 23:33:28.401914 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:28.401869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-277jv" event={"ID":"9b4a05c1-16ce-4168-b3fc-f8e4d320285d","Type":"ContainerStarted","Data":"a52365b57d9935e54c4090e4a468ad611628feed5efad57cb02da0950cdea277"} Apr 16 23:33:30.410756 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.410717 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-277jv" event={"ID":"9b4a05c1-16ce-4168-b3fc-f8e4d320285d","Type":"ContainerStarted","Data":"1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b"} Apr 16 23:33:30.411251 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.410781 2573 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kuadrant-system/kuadrant-operator-catalog-277jv" podUID="9b4a05c1-16ce-4168-b3fc-f8e4d320285d" containerName="registry-server" containerID="cri-o://1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b" gracePeriod=2 Apr 16 23:33:30.412324 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.412295 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dg642" event={"ID":"e089524b-e02c-46a3-adc8-2e96780cb78c","Type":"ContainerStarted","Data":"d5bc8f32b8c3f3480bdeaf6182ca1d5cc93e3482fbb36ce5d660a5a6c24b732e"} Apr 16 23:33:30.424379 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.424326 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-277jv" podStartSLOduration=1.296589272 podStartE2EDuration="3.424310122s" podCreationTimestamp="2026-04-16 23:33:27 +0000 UTC" firstStartedPulling="2026-04-16 23:33:27.493791013 +0000 UTC m=+415.095710095" lastFinishedPulling="2026-04-16 23:33:29.621511866 +0000 UTC m=+417.223430945" observedRunningTime="2026-04-16 23:33:30.423756812 +0000 UTC m=+418.025675919" watchObservedRunningTime="2026-04-16 23:33:30.424310122 +0000 UTC m=+418.026229224" Apr 16 23:33:30.436491 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.436442 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-dg642" podStartSLOduration=1.891780687 podStartE2EDuration="3.436430097s" podCreationTimestamp="2026-04-16 23:33:27 +0000 UTC" firstStartedPulling="2026-04-16 23:33:28.077288232 +0000 UTC m=+415.679207312" lastFinishedPulling="2026-04-16 23:33:29.62193763 +0000 UTC m=+417.223856722" observedRunningTime="2026-04-16 23:33:30.436215278 +0000 UTC m=+418.038134378" watchObservedRunningTime="2026-04-16 23:33:30.436430097 +0000 UTC m=+418.038349211" Apr 16 23:33:30.660264 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.660242 2573 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-277jv" Apr 16 23:33:30.695396 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.695373 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8g7n\" (UniqueName: \"kubernetes.io/projected/9b4a05c1-16ce-4168-b3fc-f8e4d320285d-kube-api-access-n8g7n\") pod \"9b4a05c1-16ce-4168-b3fc-f8e4d320285d\" (UID: \"9b4a05c1-16ce-4168-b3fc-f8e4d320285d\") " Apr 16 23:33:30.697417 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.697389 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4a05c1-16ce-4168-b3fc-f8e4d320285d-kube-api-access-n8g7n" (OuterVolumeSpecName: "kube-api-access-n8g7n") pod "9b4a05c1-16ce-4168-b3fc-f8e4d320285d" (UID: "9b4a05c1-16ce-4168-b3fc-f8e4d320285d"). InnerVolumeSpecName "kube-api-access-n8g7n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:33:30.796859 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:30.796837 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n8g7n\" (UniqueName: \"kubernetes.io/projected/9b4a05c1-16ce-4168-b3fc-f8e4d320285d-kube-api-access-n8g7n\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:33:31.417091 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.417014 2573 generic.go:358] "Generic (PLEG): container finished" podID="9b4a05c1-16ce-4168-b3fc-f8e4d320285d" containerID="1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b" exitCode=0 Apr 16 23:33:31.417091 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.417077 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-277jv" Apr 16 23:33:31.417514 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.417111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-277jv" event={"ID":"9b4a05c1-16ce-4168-b3fc-f8e4d320285d","Type":"ContainerDied","Data":"1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b"} Apr 16 23:33:31.417514 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.417158 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-277jv" event={"ID":"9b4a05c1-16ce-4168-b3fc-f8e4d320285d","Type":"ContainerDied","Data":"a52365b57d9935e54c4090e4a468ad611628feed5efad57cb02da0950cdea277"} Apr 16 23:33:31.417514 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.417184 2573 scope.go:117] "RemoveContainer" containerID="1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b" Apr 16 23:33:31.425082 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.425067 2573 scope.go:117] "RemoveContainer" containerID="1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b" Apr 16 23:33:31.425320 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:33:31.425302 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b\": container with ID starting with 1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b not found: ID does not exist" containerID="1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b" Apr 16 23:33:31.425378 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.425328 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b"} err="failed to get container status \"1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b\": rpc 
error: code = NotFound desc = could not find container \"1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b\": container with ID starting with 1a214d83e9bb5e217165c42d3448c252825d7f00a8ad27fd219567b4919bee2b not found: ID does not exist" Apr 16 23:33:31.432075 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.432046 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-277jv"] Apr 16 23:33:31.434890 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:31.434868 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-277jv"] Apr 16 23:33:33.057648 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:33.057614 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4a05c1-16ce-4168-b3fc-f8e4d320285d" path="/var/lib/kubelet/pods/9b4a05c1-16ce-4168-b3fc-f8e4d320285d/volumes" Apr 16 23:33:37.954492 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:37.954463 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:37.954979 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:37.954514 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:37.976473 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:37.976440 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:38.462055 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:38.462029 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-dg642" Apr 16 23:33:58.697411 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.697378 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9"] Apr 16 23:33:58.697833 
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.697755 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b4a05c1-16ce-4168-b3fc-f8e4d320285d" containerName="registry-server" Apr 16 23:33:58.697833 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.697767 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4a05c1-16ce-4168-b3fc-f8e4d320285d" containerName="registry-server" Apr 16 23:33:58.697909 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.697837 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b4a05c1-16ce-4168-b3fc-f8e4d320285d" containerName="registry-server" Apr 16 23:33:58.700865 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.700850 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" Apr 16 23:33:58.703099 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.703081 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-k4zw5\"" Apr 16 23:33:58.712127 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.712102 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9"] Apr 16 23:33:58.808409 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.808385 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj688\" (UniqueName: \"kubernetes.io/projected/3173c993-c6b1-4497-8bc3-b186b9f09913-kube-api-access-lj688\") pod \"limitador-operator-controller-manager-85c4996f8c-95tz9\" (UID: \"3173c993-c6b1-4497-8bc3-b186b9f09913\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" Apr 16 23:33:58.909585 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.909531 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lj688\" (UniqueName: \"kubernetes.io/projected/3173c993-c6b1-4497-8bc3-b186b9f09913-kube-api-access-lj688\") pod \"limitador-operator-controller-manager-85c4996f8c-95tz9\" (UID: \"3173c993-c6b1-4497-8bc3-b186b9f09913\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" Apr 16 23:33:58.919622 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:58.919601 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj688\" (UniqueName: \"kubernetes.io/projected/3173c993-c6b1-4497-8bc3-b186b9f09913-kube-api-access-lj688\") pod \"limitador-operator-controller-manager-85c4996f8c-95tz9\" (UID: \"3173c993-c6b1-4497-8bc3-b186b9f09913\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" Apr 16 23:33:59.011741 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:59.011683 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" Apr 16 23:33:59.138417 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:59.138394 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9"] Apr 16 23:33:59.140703 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:33:59.140671 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3173c993_c6b1_4497_8bc3_b186b9f09913.slice/crio-ace9a938e2baee73774e049119c098340047014338e2d65dc2ecb665e33551a7 WatchSource:0}: Error finding container ace9a938e2baee73774e049119c098340047014338e2d65dc2ecb665e33551a7: Status 404 returned error can't find the container with id ace9a938e2baee73774e049119c098340047014338e2d65dc2ecb665e33551a7 Apr 16 23:33:59.509419 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:33:59.509387 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" event={"ID":"3173c993-c6b1-4497-8bc3-b186b9f09913","Type":"ContainerStarted","Data":"ace9a938e2baee73774e049119c098340047014338e2d65dc2ecb665e33551a7"} Apr 16 23:34:01.517403 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:01.517364 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" event={"ID":"3173c993-c6b1-4497-8bc3-b186b9f09913","Type":"ContainerStarted","Data":"816a27873bd26bc36df73cf8a76facdf90a61e9a439d5378bd2cc36cd2c1b096"} Apr 16 23:34:01.517403 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:01.517415 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" Apr 16 23:34:01.533739 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:01.533694 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" podStartSLOduration=1.8769009410000002 podStartE2EDuration="3.53368167s" podCreationTimestamp="2026-04-16 23:33:58 +0000 UTC" firstStartedPulling="2026-04-16 23:33:59.142842215 +0000 UTC m=+446.744761294" lastFinishedPulling="2026-04-16 23:34:00.799622941 +0000 UTC m=+448.401542023" observedRunningTime="2026-04-16 23:34:01.531369431 +0000 UTC m=+449.133288555" watchObservedRunningTime="2026-04-16 23:34:01.53368167 +0000 UTC m=+449.135600770" Apr 16 23:34:06.035009 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.034978 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"] Apr 16 23:34:06.038455 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.038437 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"
Apr 16 23:34:06.041061 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.041038 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-9pgq8\""
Apr 16 23:34:06.041892 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.041877 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\""
Apr 16 23:34:06.052156 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.052136 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"]
Apr 16 23:34:06.171176 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.171144 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svkm2\" (UniqueName: \"kubernetes.io/projected/233c7354-6628-431b-93c7-bbe1ff4897e2-kube-api-access-svkm2\") pod \"dns-operator-controller-manager-648d5c98bc-n2qcm\" (UID: \"233c7354-6628-431b-93c7-bbe1ff4897e2\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"
Apr 16 23:34:06.272295 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.272267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svkm2\" (UniqueName: \"kubernetes.io/projected/233c7354-6628-431b-93c7-bbe1ff4897e2-kube-api-access-svkm2\") pod \"dns-operator-controller-manager-648d5c98bc-n2qcm\" (UID: \"233c7354-6628-431b-93c7-bbe1ff4897e2\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"
Apr 16 23:34:06.279730 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.279711 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svkm2\" (UniqueName: \"kubernetes.io/projected/233c7354-6628-431b-93c7-bbe1ff4897e2-kube-api-access-svkm2\") pod \"dns-operator-controller-manager-648d5c98bc-n2qcm\" (UID: \"233c7354-6628-431b-93c7-bbe1ff4897e2\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"
Apr 16 23:34:06.348819 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.348758 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"
Apr 16 23:34:06.505389 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.505360 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"]
Apr 16 23:34:06.507414 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:34:06.507384 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod233c7354_6628_431b_93c7_bbe1ff4897e2.slice/crio-9040e92580ad37a1336d3724693d5dd49567833382a46ecadd346b486f19e7e3 WatchSource:0}: Error finding container 9040e92580ad37a1336d3724693d5dd49567833382a46ecadd346b486f19e7e3: Status 404 returned error can't find the container with id 9040e92580ad37a1336d3724693d5dd49567833382a46ecadd346b486f19e7e3
Apr 16 23:34:06.535834 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:06.535809 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm" event={"ID":"233c7354-6628-431b-93c7-bbe1ff4897e2","Type":"ContainerStarted","Data":"9040e92580ad37a1336d3724693d5dd49567833382a46ecadd346b486f19e7e3"}
Apr 16 23:34:09.547714 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:09.547679 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm" event={"ID":"233c7354-6628-431b-93c7-bbe1ff4897e2","Type":"ContainerStarted","Data":"8740b649db5eb64659e7767065b9b0c5e449b85348c4f61f1554807bd148fb00"}
Apr 16 23:34:09.548161 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:09.547773 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"
Apr 16 23:34:09.566104 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:09.566054 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm" podStartSLOduration=1.145378994 podStartE2EDuration="3.566039821s" podCreationTimestamp="2026-04-16 23:34:06 +0000 UTC" firstStartedPulling="2026-04-16 23:34:06.509386557 +0000 UTC m=+454.111305641" lastFinishedPulling="2026-04-16 23:34:08.930047374 +0000 UTC m=+456.531966468" observedRunningTime="2026-04-16 23:34:09.564107513 +0000 UTC m=+457.166026613" watchObservedRunningTime="2026-04-16 23:34:09.566039821 +0000 UTC m=+457.167958921"
Apr 16 23:34:10.479726 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.479695 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"]
Apr 16 23:34:10.483233 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.483218 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:10.485668 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.485647 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-p2glc\""
Apr 16 23:34:10.495615 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.495596 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"]
Apr 16 23:34:10.608083 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.608057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" (UID: \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:10.608363 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.608097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2xms\" (UniqueName: \"kubernetes.io/projected/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-kube-api-access-k2xms\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" (UID: \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:10.709193 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.709166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xms\" (UniqueName: \"kubernetes.io/projected/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-kube-api-access-k2xms\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" (UID: \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:10.709289 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.709256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" (UID: \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:10.709730 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.709712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" (UID: \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:10.717465 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.717441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2xms\" (UniqueName: \"kubernetes.io/projected/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-kube-api-access-k2xms\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" (UID: \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:10.793780 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.793724 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:10.922162 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:10.922140 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"]
Apr 16 23:34:10.924647 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:34:10.924620 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2471dbe9_4c99_4f27_92bb_4231ac85a4bf.slice/crio-75b81b434bcffe6a68df7f08eeba7afca4025d311fffff76ec4d31ae438ccf32 WatchSource:0}: Error finding container 75b81b434bcffe6a68df7f08eeba7afca4025d311fffff76ec4d31ae438ccf32: Status 404 returned error can't find the container with id 75b81b434bcffe6a68df7f08eeba7afca4025d311fffff76ec4d31ae438ccf32
Apr 16 23:34:11.555403 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:11.555359 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" event={"ID":"2471dbe9-4c99-4f27-92bb-4231ac85a4bf","Type":"ContainerStarted","Data":"75b81b434bcffe6a68df7f08eeba7afca4025d311fffff76ec4d31ae438ccf32"}
Apr 16 23:34:12.523296 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:12.523260 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9"
Apr 16 23:34:16.574892 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:16.574852 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" event={"ID":"2471dbe9-4c99-4f27-92bb-4231ac85a4bf","Type":"ContainerStarted","Data":"7a803eea044471e49bfc3483572a41db079d2b7b1e42fb8f364d74b48abb60af"}
Apr 16 23:34:16.575309 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:16.574910 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:16.594254 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:16.594205 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" podStartSLOduration=1.827745969 podStartE2EDuration="6.594191804s" podCreationTimestamp="2026-04-16 23:34:10 +0000 UTC" firstStartedPulling="2026-04-16 23:34:10.92701011 +0000 UTC m=+458.528929189" lastFinishedPulling="2026-04-16 23:34:15.693455946 +0000 UTC m=+463.295375024" observedRunningTime="2026-04-16 23:34:16.592277822 +0000 UTC m=+464.194196924" watchObservedRunningTime="2026-04-16 23:34:16.594191804 +0000 UTC m=+464.196110971"
Apr 16 23:34:20.553304 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:20.553272 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-n2qcm"
Apr 16 23:34:27.580356 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:27.580324 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:28.445943 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.445906 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"]
Apr 16 23:34:28.446190 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.446123 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" containerName="manager" containerID="cri-o://7a803eea044471e49bfc3483572a41db079d2b7b1e42fb8f364d74b48abb60af" gracePeriod=2
Apr 16 23:34:28.452785 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.452756 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"]
Apr 16 23:34:28.476981 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.476956 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9"]
Apr 16 23:34:28.477272 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.477247 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" containerName="manager" containerID="cri-o://816a27873bd26bc36df73cf8a76facdf90a61e9a439d5378bd2cc36cd2c1b096" gracePeriod=2
Apr 16 23:34:28.479470 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.479439 2573 status_manager.go:895] "Failed to get status for pod" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.491844 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.491818 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"]
Apr 16 23:34:28.492241 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.492224 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" containerName="manager"
Apr 16 23:34:28.492315 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.492242 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" containerName="manager"
Apr 16 23:34:28.492315 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.492314 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" containerName="manager"
Apr 16 23:34:28.495358 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.495337 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9"]
Apr 16 23:34:28.495476 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.495461 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:28.505657 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.505632 2573 status_manager.go:895] "Failed to get status for pod" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.507681 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.507658 2573 status_manager.go:895] "Failed to get status for pod" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" err="pods \"limitador-operator-controller-manager-85c4996f8c-95tz9\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.526488 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.526451 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"]
Apr 16 23:34:28.528560 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.528521 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"]
Apr 16 23:34:28.528908 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.528891 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" containerName="manager"
Apr 16 23:34:28.528908 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.528908 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" containerName="manager"
Apr 16 23:34:28.529071 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.529037 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" containerName="manager"
Apr 16 23:34:28.532175 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.532156 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"
Apr 16 23:34:28.548142 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.548119 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"]
Apr 16 23:34:28.561882 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.561858 2573 status_manager.go:895] "Failed to get status for pod" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" err="pods \"limitador-operator-controller-manager-85c4996f8c-95tz9\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.563996 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.563972 2573 status_manager.go:895] "Failed to get status for pod" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.619058 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.619027 2573 generic.go:358] "Generic (PLEG): container finished" podID="3173c993-c6b1-4497-8bc3-b186b9f09913" containerID="816a27873bd26bc36df73cf8a76facdf90a61e9a439d5378bd2cc36cd2c1b096" exitCode=0
Apr 16 23:34:28.620745 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.620712 2573 generic.go:358] "Generic (PLEG): container finished" podID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" containerID="7a803eea044471e49bfc3483572a41db079d2b7b1e42fb8f364d74b48abb60af" exitCode=0
Apr 16 23:34:28.645982 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.645956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/18de8afb-4408-44de-84c3-f6c2f557063c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gk98j\" (UID: \"18de8afb-4408-44de-84c3-f6c2f557063c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:28.646078 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.646048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkj7b\" (UniqueName: \"kubernetes.io/projected/ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259-kube-api-access-qkj7b\") pod \"limitador-operator-controller-manager-85c4996f8c-kg69x\" (UID: \"ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"
Apr 16 23:34:28.646128 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.646114 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9gvq\" (UniqueName: \"kubernetes.io/projected/18de8afb-4408-44de-84c3-f6c2f557063c-kube-api-access-m9gvq\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gk98j\" (UID: \"18de8afb-4408-44de-84c3-f6c2f557063c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:28.710172 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.710153 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:28.712483 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.712457 2573 status_manager.go:895] "Failed to get status for pod" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.713326 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.713309 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9"
Apr 16 23:34:28.714597 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.714575 2573 status_manager.go:895] "Failed to get status for pod" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" err="pods \"limitador-operator-controller-manager-85c4996f8c-95tz9\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.716251 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.716234 2573 status_manager.go:895] "Failed to get status for pod" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-qpxbf\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.718074 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.718054 2573 status_manager.go:895] "Failed to get status for pod" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9" err="pods \"limitador-operator-controller-manager-85c4996f8c-95tz9\" is forbidden: User \"system:node:ip-10-0-136-153.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-136-153.ec2.internal' and this object"
Apr 16 23:34:28.746970 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.746948 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m9gvq\" (UniqueName: \"kubernetes.io/projected/18de8afb-4408-44de-84c3-f6c2f557063c-kube-api-access-m9gvq\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gk98j\" (UID: \"18de8afb-4408-44de-84c3-f6c2f557063c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:28.747036 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.747005 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/18de8afb-4408-44de-84c3-f6c2f557063c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gk98j\" (UID: \"18de8afb-4408-44de-84c3-f6c2f557063c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:28.747073 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.747048 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkj7b\" (UniqueName: \"kubernetes.io/projected/ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259-kube-api-access-qkj7b\") pod \"limitador-operator-controller-manager-85c4996f8c-kg69x\" (UID: \"ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"
Apr 16 23:34:28.747365 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.747349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/18de8afb-4408-44de-84c3-f6c2f557063c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gk98j\" (UID: \"18de8afb-4408-44de-84c3-f6c2f557063c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:28.756810 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.756786 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9gvq\" (UniqueName: \"kubernetes.io/projected/18de8afb-4408-44de-84c3-f6c2f557063c-kube-api-access-m9gvq\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-gk98j\" (UID: \"18de8afb-4408-44de-84c3-f6c2f557063c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:28.757032 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.757015 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkj7b\" (UniqueName: \"kubernetes.io/projected/ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259-kube-api-access-qkj7b\") pod \"limitador-operator-controller-manager-85c4996f8c-kg69x\" (UID: \"ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"
Apr 16 23:34:28.847775 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.847743 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2xms\" (UniqueName: \"kubernetes.io/projected/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-kube-api-access-k2xms\") pod \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\" (UID: \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\") "
Apr 16 23:34:28.847863 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.847795 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj688\" (UniqueName: \"kubernetes.io/projected/3173c993-c6b1-4497-8bc3-b186b9f09913-kube-api-access-lj688\") pod \"3173c993-c6b1-4497-8bc3-b186b9f09913\" (UID: \"3173c993-c6b1-4497-8bc3-b186b9f09913\") "
Apr 16 23:34:28.847863 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.847825 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-extensions-socket-volume\") pod \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\" (UID: \"2471dbe9-4c99-4f27-92bb-4231ac85a4bf\") "
Apr 16 23:34:28.848257 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.848229 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "2471dbe9-4c99-4f27-92bb-4231ac85a4bf" (UID: "2471dbe9-4c99-4f27-92bb-4231ac85a4bf"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:34:28.849525 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.849502 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-kube-api-access-k2xms" (OuterVolumeSpecName: "kube-api-access-k2xms") pod "2471dbe9-4c99-4f27-92bb-4231ac85a4bf" (UID: "2471dbe9-4c99-4f27-92bb-4231ac85a4bf"). InnerVolumeSpecName "kube-api-access-k2xms". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:34:28.849717 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.849701 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3173c993-c6b1-4497-8bc3-b186b9f09913-kube-api-access-lj688" (OuterVolumeSpecName: "kube-api-access-lj688") pod "3173c993-c6b1-4497-8bc3-b186b9f09913" (UID: "3173c993-c6b1-4497-8bc3-b186b9f09913"). InnerVolumeSpecName "kube-api-access-lj688". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:34:28.869981 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.869961 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:28.875604 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.875583 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"
Apr 16 23:34:28.949325 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.949257 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lj688\" (UniqueName: \"kubernetes.io/projected/3173c993-c6b1-4497-8bc3-b186b9f09913-kube-api-access-lj688\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\""
Apr 16 23:34:28.949325 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.949287 2573 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-extensions-socket-volume\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\""
Apr 16 23:34:28.949325 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:28.949302 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k2xms\" (UniqueName: \"kubernetes.io/projected/2471dbe9-4c99-4f27-92bb-4231ac85a4bf-kube-api-access-k2xms\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\""
Apr 16 23:34:29.002096 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.002063 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"]
Apr 16 23:34:29.004792 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:34:29.004767 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18de8afb_4408_44de_84c3_f6c2f557063c.slice/crio-ec5c7ef89cb9c70e241c0bf205e501faf6741c17503de380a100426ec4ec9377 WatchSource:0}: Error finding container ec5c7ef89cb9c70e241c0bf205e501faf6741c17503de380a100426ec4ec9377: Status 404 returned error can't find the container with id ec5c7ef89cb9c70e241c0bf205e501faf6741c17503de380a100426ec4ec9377
Apr 16 23:34:29.058333 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.058302 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2471dbe9-4c99-4f27-92bb-4231ac85a4bf" path="/var/lib/kubelet/pods/2471dbe9-4c99-4f27-92bb-4231ac85a4bf/volumes"
Apr 16 23:34:29.058657 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.058636 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3173c993-c6b1-4497-8bc3-b186b9f09913" path="/var/lib/kubelet/pods/3173c993-c6b1-4497-8bc3-b186b9f09913/volumes"
Apr 16 23:34:29.226907 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.226886 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"]
Apr 16 23:34:29.229469 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:34:29.229442 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2c3a5c_ba2b_4ae2_bd5e_71ab11f49259.slice/crio-ef8596057e1d5629a3ea1aa4bf57ccda6b70261ece3444a948cb64de996a4e21 WatchSource:0}: Error finding container ef8596057e1d5629a3ea1aa4bf57ccda6b70261ece3444a948cb64de996a4e21: Status 404 returned error can't find the container with id ef8596057e1d5629a3ea1aa4bf57ccda6b70261ece3444a948cb64de996a4e21
Apr 16 23:34:29.625592 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.625495 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x" event={"ID":"ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259","Type":"ContainerStarted","Data":"72f96578c5c1b627b802cdfb5a80a67cc20efb2e6032da5c192a97cf43ed0b16"}
Apr 16 23:34:29.625592 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.625552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x" event={"ID":"ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259","Type":"ContainerStarted","Data":"ef8596057e1d5629a3ea1aa4bf57ccda6b70261ece3444a948cb64de996a4e21"}
Apr 16 23:34:29.626030 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.625666 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"
Apr 16 23:34:29.626694 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.626678 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-95tz9"
Apr 16 23:34:29.626694 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.626690 2573 scope.go:117] "RemoveContainer" containerID="816a27873bd26bc36df73cf8a76facdf90a61e9a439d5378bd2cc36cd2c1b096"
Apr 16 23:34:29.628156 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.628135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j" event={"ID":"18de8afb-4408-44de-84c3-f6c2f557063c","Type":"ContainerStarted","Data":"4fdd4e851d03c25ebf351630fc9136e367fa41539501f9d167a54600b4b345bb"}
Apr 16 23:34:29.628253 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.628161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j" event={"ID":"18de8afb-4408-44de-84c3-f6c2f557063c","Type":"ContainerStarted","Data":"ec5c7ef89cb9c70e241c0bf205e501faf6741c17503de380a100426ec4ec9377"}
Apr 16 23:34:29.628253 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.628217 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:29.629520 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.629501 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-qpxbf"
Apr 16 23:34:29.635069 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.635054 2573 scope.go:117] "RemoveContainer" containerID="7a803eea044471e49bfc3483572a41db079d2b7b1e42fb8f364d74b48abb60af"
Apr 16 23:34:29.644984 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.644948 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x" podStartSLOduration=1.644938228 podStartE2EDuration="1.644938228s" podCreationTimestamp="2026-04-16 23:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:34:29.642000902 +0000 UTC m=+477.243920004" watchObservedRunningTime="2026-04-16 23:34:29.644938228 +0000 UTC m=+477.246857327"
Apr 16 23:34:29.661016 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:29.660977 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j" podStartSLOduration=1.660964538 podStartE2EDuration="1.660964538s" podCreationTimestamp="2026-04-16 23:34:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:34:29.659181643 +0000 UTC m=+477.261100767" watchObservedRunningTime="2026-04-16 23:34:29.660964538 +0000 UTC m=+477.262883639"
Apr 16 23:34:40.636639 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:40.636604 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-kg69x"
Apr 16 23:34:40.637133 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:40.636758 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:45.564092 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.564058 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"]
Apr 16 23:34:45.564479 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.564284 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j" podUID="18de8afb-4408-44de-84c3-f6c2f557063c" containerName="manager" containerID="cri-o://4fdd4e851d03c25ebf351630fc9136e367fa41539501f9d167a54600b4b345bb" gracePeriod=10
Apr 16 23:34:45.685004 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.684975 2573 generic.go:358] "Generic (PLEG): container finished" podID="18de8afb-4408-44de-84c3-f6c2f557063c" containerID="4fdd4e851d03c25ebf351630fc9136e367fa41539501f9d167a54600b4b345bb" exitCode=0
Apr 16 23:34:45.685098 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.685045 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j" event={"ID":"18de8afb-4408-44de-84c3-f6c2f557063c","Type":"ContainerDied","Data":"4fdd4e851d03c25ebf351630fc9136e367fa41539501f9d167a54600b4b345bb"}
Apr 16 23:34:45.799304 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.799281 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"
Apr 16 23:34:45.874742 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.874683 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9gvq\" (UniqueName: \"kubernetes.io/projected/18de8afb-4408-44de-84c3-f6c2f557063c-kube-api-access-m9gvq\") pod \"18de8afb-4408-44de-84c3-f6c2f557063c\" (UID: \"18de8afb-4408-44de-84c3-f6c2f557063c\") "
Apr 16 23:34:45.874742 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.874718 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/18de8afb-4408-44de-84c3-f6c2f557063c-extensions-socket-volume\") pod \"18de8afb-4408-44de-84c3-f6c2f557063c\" (UID: \"18de8afb-4408-44de-84c3-f6c2f557063c\") "
Apr 16 23:34:45.875071 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.875047 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18de8afb-4408-44de-84c3-f6c2f557063c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "18de8afb-4408-44de-84c3-f6c2f557063c" (UID: "18de8afb-4408-44de-84c3-f6c2f557063c"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 23:34:45.876626 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.876603 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18de8afb-4408-44de-84c3-f6c2f557063c-kube-api-access-m9gvq" (OuterVolumeSpecName: "kube-api-access-m9gvq") pod "18de8afb-4408-44de-84c3-f6c2f557063c" (UID: "18de8afb-4408-44de-84c3-f6c2f557063c"). InnerVolumeSpecName "kube-api-access-m9gvq".
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:34:45.975853 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.975826 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m9gvq\" (UniqueName: \"kubernetes.io/projected/18de8afb-4408-44de-84c3-f6c2f557063c-kube-api-access-m9gvq\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:34:45.975853 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:45.975849 2573 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/18de8afb-4408-44de-84c3-f6c2f557063c-extensions-socket-volume\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:34:46.689504 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:46.689481 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j" Apr 16 23:34:46.689504 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:46.689500 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j" event={"ID":"18de8afb-4408-44de-84c3-f6c2f557063c","Type":"ContainerDied","Data":"ec5c7ef89cb9c70e241c0bf205e501faf6741c17503de380a100426ec4ec9377"} Apr 16 23:34:46.689994 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:46.689562 2573 scope.go:117] "RemoveContainer" containerID="4fdd4e851d03c25ebf351630fc9136e367fa41539501f9d167a54600b4b345bb" Apr 16 23:34:46.712065 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:46.712043 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"] Apr 16 23:34:46.716009 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:34:46.715987 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-gk98j"] Apr 16 23:34:47.058846 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:34:47.058775 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18de8afb-4408-44de-84c3-f6c2f557063c" path="/var/lib/kubelet/pods/18de8afb-4408-44de-84c3-f6c2f557063c/volumes" Apr 16 23:35:01.741651 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.741611 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2"] Apr 16 23:35:01.742183 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.741998 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18de8afb-4408-44de-84c3-f6c2f557063c" containerName="manager" Apr 16 23:35:01.742183 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.742013 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="18de8afb-4408-44de-84c3-f6c2f557063c" containerName="manager" Apr 16 23:35:01.742183 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.742086 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="18de8afb-4408-44de-84c3-f6c2f557063c" containerName="manager" Apr 16 23:35:01.747837 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.747812 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.750230 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.750204 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-4v7n6\"" Apr 16 23:35:01.755869 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.755840 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2"] Apr 16 23:35:01.788629 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788606 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dfe10a9a-9555-4894-830a-e5027f15736c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.788753 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dfe10a9a-9555-4894-830a-e5027f15736c-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.788753 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788663 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.788849 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788752 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.788849 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788785 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dfe10a9a-9555-4894-830a-e5027f15736c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.788849 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788819 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.788849 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.789039 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788856 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzr8\" (UniqueName: \"kubernetes.io/projected/dfe10a9a-9555-4894-830a-e5027f15736c-kube-api-access-tpzr8\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.789039 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.788904 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890235 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890355 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890355 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890301 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dfe10a9a-9555-4894-830a-e5027f15736c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890355 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890332 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890490 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890557 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzr8\" (UniqueName: \"kubernetes.io/projected/dfe10a9a-9555-4894-830a-e5027f15736c-kube-api-access-tpzr8\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890672 
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890672 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dfe10a9a-9555-4894-830a-e5027f15736c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890672 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890660 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dfe10a9a-9555-4894-830a-e5027f15736c-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890784 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890658 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890784 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890681 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.890958 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.891013 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.890961 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.891515 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.891490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/dfe10a9a-9555-4894-830a-e5027f15736c-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.892632 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.892603 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/dfe10a9a-9555-4894-830a-e5027f15736c-istio-envoy\") pod 
\"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.892827 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.892804 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/dfe10a9a-9555-4894-830a-e5027f15736c-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.897766 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.897741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpzr8\" (UniqueName: \"kubernetes.io/projected/dfe10a9a-9555-4894-830a-e5027f15736c-kube-api-access-tpzr8\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:01.897845 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:01.897817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/dfe10a9a-9555-4894-830a-e5027f15736c-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wcqv2\" (UID: \"dfe10a9a-9555-4894-830a-e5027f15736c\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:02.060612 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:02.060557 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:02.184419 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:02.184392 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2"] Apr 16 23:35:02.186993 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:35:02.186964 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe10a9a_9555_4894_830a_e5027f15736c.slice/crio-a6517662d66c3c32a73fcf34d62b423958e3da5cd0ff529a239003a9549faa6a WatchSource:0}: Error finding container a6517662d66c3c32a73fcf34d62b423958e3da5cd0ff529a239003a9549faa6a: Status 404 returned error can't find the container with id a6517662d66c3c32a73fcf34d62b423958e3da5cd0ff529a239003a9549faa6a Apr 16 23:35:02.189522 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:02.189473 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 23:35:02.189640 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:02.189593 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 23:35:02.189703 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:02.189640 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 23:35:02.749545 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:02.749499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" 
event={"ID":"dfe10a9a-9555-4894-830a-e5027f15736c","Type":"ContainerStarted","Data":"298434f245674c6d565b2c2b81e4165a90ae0e241d9278b612eb388ac83e7ce1"} Apr 16 23:35:02.749545 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:02.749531 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" event={"ID":"dfe10a9a-9555-4894-830a-e5027f15736c","Type":"ContainerStarted","Data":"a6517662d66c3c32a73fcf34d62b423958e3da5cd0ff529a239003a9549faa6a"} Apr 16 23:35:02.773860 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:02.773816 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" podStartSLOduration=1.7738009080000001 podStartE2EDuration="1.773800908s" podCreationTimestamp="2026-04-16 23:35:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 23:35:02.772257183 +0000 UTC m=+510.374176285" watchObservedRunningTime="2026-04-16 23:35:02.773800908 +0000 UTC m=+510.375720010" Apr 16 23:35:03.060792 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:03.060718 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:03.066128 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:03.066106 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:03.752758 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:03.752731 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:03.753743 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:03.753724 2573 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wcqv2" Apr 16 23:35:06.036598 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.036568 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mqq6v"] Apr 16 23:35:06.039938 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.039918 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:06.042505 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.042472 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 23:35:06.042606 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.042591 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-26zbs\"" Apr 16 23:35:06.052015 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.051994 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mqq6v"] Apr 16 23:35:06.123878 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.123850 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-config-file\") pod \"limitador-limitador-7d549b5b-mqq6v\" (UID: \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:06.124018 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.123890 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99s9d\" (UniqueName: \"kubernetes.io/projected/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-kube-api-access-99s9d\") pod \"limitador-limitador-7d549b5b-mqq6v\" (UID: \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\") " 
pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:06.140502 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.140471 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mqq6v"] Apr 16 23:35:06.224490 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.224465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-config-file\") pod \"limitador-limitador-7d549b5b-mqq6v\" (UID: \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:06.224605 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.224496 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99s9d\" (UniqueName: \"kubernetes.io/projected/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-kube-api-access-99s9d\") pod \"limitador-limitador-7d549b5b-mqq6v\" (UID: \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:06.225076 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.225060 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-config-file\") pod \"limitador-limitador-7d549b5b-mqq6v\" (UID: \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:06.232995 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.232973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99s9d\" (UniqueName: \"kubernetes.io/projected/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-kube-api-access-99s9d\") pod \"limitador-limitador-7d549b5b-mqq6v\" (UID: \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\") " pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:06.350895 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:35:06.350833 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:06.475181 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.475161 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mqq6v"] Apr 16 23:35:06.477723 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:35:06.477695 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4311ef3_32b7_4cbe_8b9a_48b0defe4187.slice/crio-757409043a16ad00565271e1350547bc8e04fd7452415e3e2b9af3441eba3fd7 WatchSource:0}: Error finding container 757409043a16ad00565271e1350547bc8e04fd7452415e3e2b9af3441eba3fd7: Status 404 returned error can't find the container with id 757409043a16ad00565271e1350547bc8e04fd7452415e3e2b9af3441eba3fd7 Apr 16 23:35:06.763120 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:06.763089 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" event={"ID":"c4311ef3-32b7-4cbe-8b9a-48b0defe4187","Type":"ContainerStarted","Data":"757409043a16ad00565271e1350547bc8e04fd7452415e3e2b9af3441eba3fd7"} Apr 16 23:35:09.777143 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:09.777109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" event={"ID":"c4311ef3-32b7-4cbe-8b9a-48b0defe4187","Type":"ContainerStarted","Data":"0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e"} Apr 16 23:35:09.777527 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:09.777182 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" Apr 16 23:35:09.794188 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:09.794133 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" podStartSLOduration=0.892404073 podStartE2EDuration="3.794117421s" podCreationTimestamp="2026-04-16 23:35:06 +0000 UTC" firstStartedPulling="2026-04-16 23:35:06.479597158 +0000 UTC m=+514.081516236" lastFinishedPulling="2026-04-16 23:35:09.381310487 +0000 UTC m=+516.983229584" observedRunningTime="2026-04-16 23:35:09.792713078 +0000 UTC m=+517.394632208" watchObservedRunningTime="2026-04-16 23:35:09.794117421 +0000 UTC m=+517.396036523"
Apr 16 23:35:20.781372 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:20.781292 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v"
Apr 16 23:35:22.094808 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.094772 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mqq6v"]
Apr 16 23:35:22.095246 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.095032 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" podUID="c4311ef3-32b7-4cbe-8b9a-48b0defe4187" containerName="limitador" containerID="cri-o://0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e" gracePeriod=30
Apr 16 23:35:22.636667 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.636646 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v"
Apr 16 23:35:22.759215 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.759185 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-config-file\") pod \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\" (UID: \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\") "
Apr 16 23:35:22.759349 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.759229 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99s9d\" (UniqueName: \"kubernetes.io/projected/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-kube-api-access-99s9d\") pod \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\" (UID: \"c4311ef3-32b7-4cbe-8b9a-48b0defe4187\") "
Apr 16 23:35:22.759581 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.759523 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-config-file" (OuterVolumeSpecName: "config-file") pod "c4311ef3-32b7-4cbe-8b9a-48b0defe4187" (UID: "c4311ef3-32b7-4cbe-8b9a-48b0defe4187"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 23:35:22.761308 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.761284 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-kube-api-access-99s9d" (OuterVolumeSpecName: "kube-api-access-99s9d") pod "c4311ef3-32b7-4cbe-8b9a-48b0defe4187" (UID: "c4311ef3-32b7-4cbe-8b9a-48b0defe4187"). InnerVolumeSpecName "kube-api-access-99s9d". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:35:22.830064 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.830038 2573 generic.go:358] "Generic (PLEG): container finished" podID="c4311ef3-32b7-4cbe-8b9a-48b0defe4187" containerID="0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e" exitCode=0
Apr 16 23:35:22.830158 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.830100 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" event={"ID":"c4311ef3-32b7-4cbe-8b9a-48b0defe4187","Type":"ContainerDied","Data":"0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e"}
Apr 16 23:35:22.830158 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.830106 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v"
Apr 16 23:35:22.830158 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.830131 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-mqq6v" event={"ID":"c4311ef3-32b7-4cbe-8b9a-48b0defe4187","Type":"ContainerDied","Data":"757409043a16ad00565271e1350547bc8e04fd7452415e3e2b9af3441eba3fd7"}
Apr 16 23:35:22.830158 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.830147 2573 scope.go:117] "RemoveContainer" containerID="0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e"
Apr 16 23:35:22.838395 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.838375 2573 scope.go:117] "RemoveContainer" containerID="0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e"
Apr 16 23:35:22.838731 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:35:22.838714 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e\": container with ID starting with 0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e not found: ID does not exist" containerID="0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e"
Apr 16 23:35:22.838783 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.838739 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e"} err="failed to get container status \"0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e\": rpc error: code = NotFound desc = could not find container \"0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e\": container with ID starting with 0f3ae66d9f6c5f834a518a3c222633e5313d1a2df1374c5dcb14942f2e17281e not found: ID does not exist"
Apr 16 23:35:22.851025 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.851002 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mqq6v"]
Apr 16 23:35:22.855043 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.855021 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-mqq6v"]
Apr 16 23:35:22.860526 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.860502 2573 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-config-file\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\""
Apr 16 23:35:22.860526 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:22.860525 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-99s9d\" (UniqueName: \"kubernetes.io/projected/c4311ef3-32b7-4cbe-8b9a-48b0defe4187-kube-api-access-99s9d\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\""
Apr 16 23:35:23.059235 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:23.059159 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4311ef3-32b7-4cbe-8b9a-48b0defe4187" path="/var/lib/kubelet/pods/c4311ef3-32b7-4cbe-8b9a-48b0defe4187/volumes"
Apr 16 23:35:27.147863 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.147832 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-p6rfz"]
Apr 16 23:35:27.148229 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.148192 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c4311ef3-32b7-4cbe-8b9a-48b0defe4187" containerName="limitador"
Apr 16 23:35:27.148229 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.148203 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4311ef3-32b7-4cbe-8b9a-48b0defe4187" containerName="limitador"
Apr 16 23:35:27.148315 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.148264 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c4311ef3-32b7-4cbe-8b9a-48b0defe4187" containerName="limitador"
Apr 16 23:35:27.152468 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.152444 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:27.154628 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.154605 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 16 23:35:27.154753 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.154628 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-4c7g6\""
Apr 16 23:35:27.158345 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.158323 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-p6rfz"]
Apr 16 23:35:27.292578 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.292523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zfvq\" (UniqueName: \"kubernetes.io/projected/18144aff-a024-41c0-9d7e-5f920d8c7582-kube-api-access-7zfvq\") pod \"postgres-868db5846d-p6rfz\" (UID: \"18144aff-a024-41c0-9d7e-5f920d8c7582\") " pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:27.292735 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.292687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18144aff-a024-41c0-9d7e-5f920d8c7582-data\") pod \"postgres-868db5846d-p6rfz\" (UID: \"18144aff-a024-41c0-9d7e-5f920d8c7582\") " pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:27.393567 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.393512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18144aff-a024-41c0-9d7e-5f920d8c7582-data\") pod \"postgres-868db5846d-p6rfz\" (UID: \"18144aff-a024-41c0-9d7e-5f920d8c7582\") " pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:27.393700 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.393653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zfvq\" (UniqueName: \"kubernetes.io/projected/18144aff-a024-41c0-9d7e-5f920d8c7582-kube-api-access-7zfvq\") pod \"postgres-868db5846d-p6rfz\" (UID: \"18144aff-a024-41c0-9d7e-5f920d8c7582\") " pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:27.393936 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.393915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/18144aff-a024-41c0-9d7e-5f920d8c7582-data\") pod \"postgres-868db5846d-p6rfz\" (UID: \"18144aff-a024-41c0-9d7e-5f920d8c7582\") " pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:27.400892 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.400832 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zfvq\" (UniqueName: \"kubernetes.io/projected/18144aff-a024-41c0-9d7e-5f920d8c7582-kube-api-access-7zfvq\") pod \"postgres-868db5846d-p6rfz\" (UID: \"18144aff-a024-41c0-9d7e-5f920d8c7582\") " pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:27.463792 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.463762 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:27.582254 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.582232 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-p6rfz"]
Apr 16 23:35:27.584661 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:35:27.584635 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18144aff_a024_41c0_9d7e_5f920d8c7582.slice/crio-96c27d9b5e18980439c0916fc8e0bad4af778e5d4a400d40b119bde7fd299021 WatchSource:0}: Error finding container 96c27d9b5e18980439c0916fc8e0bad4af778e5d4a400d40b119bde7fd299021: Status 404 returned error can't find the container with id 96c27d9b5e18980439c0916fc8e0bad4af778e5d4a400d40b119bde7fd299021
Apr 16 23:35:27.849824 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:27.849795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-p6rfz" event={"ID":"18144aff-a024-41c0-9d7e-5f920d8c7582","Type":"ContainerStarted","Data":"96c27d9b5e18980439c0916fc8e0bad4af778e5d4a400d40b119bde7fd299021"}
Apr 16 23:35:32.872281 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:32.872210 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-p6rfz" event={"ID":"18144aff-a024-41c0-9d7e-5f920d8c7582","Type":"ContainerStarted","Data":"19c0234850033264e0f67d2cf60cdc0fe57dfca9a9d411c9216f85c30eb04582"}
Apr 16 23:35:32.872646 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:32.872347 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:32.890662 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:32.890616 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-p6rfz" podStartSLOduration=0.974975877 podStartE2EDuration="5.890602114s" podCreationTimestamp="2026-04-16 23:35:27 +0000 UTC" firstStartedPulling="2026-04-16 23:35:27.586005538 +0000 UTC m=+535.187924617" lastFinishedPulling="2026-04-16 23:35:32.501631772 +0000 UTC m=+540.103550854" observedRunningTime="2026-04-16 23:35:32.889204766 +0000 UTC m=+540.491123865" watchObservedRunningTime="2026-04-16 23:35:32.890602114 +0000 UTC m=+540.492521215"
Apr 16 23:35:38.903825 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:38.903795 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-p6rfz"
Apr 16 23:35:39.878996 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.878965 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-66b96f9c8-g6g22"]
Apr 16 23:35:39.882187 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.882166 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:39.893896 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.893877 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 16 23:35:39.893992 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.893880 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-l2dc4\""
Apr 16 23:35:39.895820 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.895805 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 16 23:35:39.900970 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.900952 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-66b96f9c8-g6g22"]
Apr 16 23:35:39.951959 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.951934 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6b7fb74478-jr42z"]
Apr 16 23:35:39.954119 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.954103 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b7fb74478-jr42z"
Apr 16 23:35:39.956744 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.956727 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wf8l7\""
Apr 16 23:35:39.970678 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.970593 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b7fb74478-jr42z"]
Apr 16 23:35:39.997916 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.997892 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c8d9\" (UniqueName: \"kubernetes.io/projected/06671266-c86a-47b0-b30f-08d19e7a9a5d-kube-api-access-6c8d9\") pod \"maas-api-66b96f9c8-g6g22\" (UID: \"06671266-c86a-47b0-b30f-08d19e7a9a5d\") " pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:39.998033 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:39.997948 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/06671266-c86a-47b0-b30f-08d19e7a9a5d-maas-api-tls\") pod \"maas-api-66b96f9c8-g6g22\" (UID: \"06671266-c86a-47b0-b30f-08d19e7a9a5d\") " pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:40.099006 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.098980 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hjz\" (UniqueName: \"kubernetes.io/projected/f7144126-e112-4369-a074-6f9171fa05df-kube-api-access-c5hjz\") pod \"maas-controller-6b7fb74478-jr42z\" (UID: \"f7144126-e112-4369-a074-6f9171fa05df\") " pod="opendatahub/maas-controller-6b7fb74478-jr42z"
Apr 16 23:35:40.099156 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.099022 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6c8d9\" (UniqueName: \"kubernetes.io/projected/06671266-c86a-47b0-b30f-08d19e7a9a5d-kube-api-access-6c8d9\") pod \"maas-api-66b96f9c8-g6g22\" (UID: \"06671266-c86a-47b0-b30f-08d19e7a9a5d\") " pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:40.099156 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.099102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/06671266-c86a-47b0-b30f-08d19e7a9a5d-maas-api-tls\") pod \"maas-api-66b96f9c8-g6g22\" (UID: \"06671266-c86a-47b0-b30f-08d19e7a9a5d\") " pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:40.101552 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.101521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/06671266-c86a-47b0-b30f-08d19e7a9a5d-maas-api-tls\") pod \"maas-api-66b96f9c8-g6g22\" (UID: \"06671266-c86a-47b0-b30f-08d19e7a9a5d\") " pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:40.110478 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.110458 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c8d9\" (UniqueName: \"kubernetes.io/projected/06671266-c86a-47b0-b30f-08d19e7a9a5d-kube-api-access-6c8d9\") pod \"maas-api-66b96f9c8-g6g22\" (UID: \"06671266-c86a-47b0-b30f-08d19e7a9a5d\") " pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:40.191958 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.191934 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:40.200003 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.199978 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hjz\" (UniqueName: \"kubernetes.io/projected/f7144126-e112-4369-a074-6f9171fa05df-kube-api-access-c5hjz\") pod \"maas-controller-6b7fb74478-jr42z\" (UID: \"f7144126-e112-4369-a074-6f9171fa05df\") " pod="opendatahub/maas-controller-6b7fb74478-jr42z"
Apr 16 23:35:40.214021 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.213991 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hjz\" (UniqueName: \"kubernetes.io/projected/f7144126-e112-4369-a074-6f9171fa05df-kube-api-access-c5hjz\") pod \"maas-controller-6b7fb74478-jr42z\" (UID: \"f7144126-e112-4369-a074-6f9171fa05df\") " pod="opendatahub/maas-controller-6b7fb74478-jr42z"
Apr 16 23:35:40.265099 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.264631 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b7fb74478-jr42z"
Apr 16 23:35:40.354398 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.354370 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-66b96f9c8-g6g22"]
Apr 16 23:35:40.356569 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:35:40.356520 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06671266_c86a_47b0_b30f_08d19e7a9a5d.slice/crio-a6331bb0cd8d9d399b39b9a8c3ce9b480f56d422dcfaa441736e668b714dd1e7 WatchSource:0}: Error finding container a6331bb0cd8d9d399b39b9a8c3ce9b480f56d422dcfaa441736e668b714dd1e7: Status 404 returned error can't find the container with id a6331bb0cd8d9d399b39b9a8c3ce9b480f56d422dcfaa441736e668b714dd1e7
Apr 16 23:35:40.406874 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.406845 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b7fb74478-jr42z"]
Apr 16 23:35:40.409659 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:35:40.409623 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7144126_e112_4369_a074_6f9171fa05df.slice/crio-4a65e49e734a14cc20ff7247f4dc1751b4507911e341eb5b302fa7210dfe5463 WatchSource:0}: Error finding container 4a65e49e734a14cc20ff7247f4dc1751b4507911e341eb5b302fa7210dfe5463: Status 404 returned error can't find the container with id 4a65e49e734a14cc20ff7247f4dc1751b4507911e341eb5b302fa7210dfe5463
Apr 16 23:35:40.899773 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.899741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b7fb74478-jr42z" event={"ID":"f7144126-e112-4369-a074-6f9171fa05df","Type":"ContainerStarted","Data":"4a65e49e734a14cc20ff7247f4dc1751b4507911e341eb5b302fa7210dfe5463"}
Apr 16 23:35:40.901003 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.900971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66b96f9c8-g6g22" event={"ID":"06671266-c86a-47b0-b30f-08d19e7a9a5d","Type":"ContainerStarted","Data":"a6331bb0cd8d9d399b39b9a8c3ce9b480f56d422dcfaa441736e668b714dd1e7"}
Apr 16 23:35:40.906930 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.906909 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7ffd5c4797-w8z9h"]
Apr 16 23:35:40.931766 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.931736 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7ffd5c4797-w8z9h"]
Apr 16 23:35:40.932160 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:40.932142 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:41.007080 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.007048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7ecf3e0a-e474-4329-bcb4-f372306d8c76-maas-api-tls\") pod \"maas-api-7ffd5c4797-w8z9h\" (UID: \"7ecf3e0a-e474-4329-bcb4-f372306d8c76\") " pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:41.007465 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.007115 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h526d\" (UniqueName: \"kubernetes.io/projected/7ecf3e0a-e474-4329-bcb4-f372306d8c76-kube-api-access-h526d\") pod \"maas-api-7ffd5c4797-w8z9h\" (UID: \"7ecf3e0a-e474-4329-bcb4-f372306d8c76\") " pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:41.107989 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.107946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7ecf3e0a-e474-4329-bcb4-f372306d8c76-maas-api-tls\") pod \"maas-api-7ffd5c4797-w8z9h\" (UID: \"7ecf3e0a-e474-4329-bcb4-f372306d8c76\") " pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:41.108162 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.108020 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h526d\" (UniqueName: \"kubernetes.io/projected/7ecf3e0a-e474-4329-bcb4-f372306d8c76-kube-api-access-h526d\") pod \"maas-api-7ffd5c4797-w8z9h\" (UID: \"7ecf3e0a-e474-4329-bcb4-f372306d8c76\") " pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:41.114687 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.114646 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/7ecf3e0a-e474-4329-bcb4-f372306d8c76-maas-api-tls\") pod \"maas-api-7ffd5c4797-w8z9h\" (UID: \"7ecf3e0a-e474-4329-bcb4-f372306d8c76\") " pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:41.147207 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.147172 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h526d\" (UniqueName: \"kubernetes.io/projected/7ecf3e0a-e474-4329-bcb4-f372306d8c76-kube-api-access-h526d\") pod \"maas-api-7ffd5c4797-w8z9h\" (UID: \"7ecf3e0a-e474-4329-bcb4-f372306d8c76\") " pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:41.251307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.251272 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:41.502211 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.502123 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7ffd5c4797-w8z9h"]
Apr 16 23:35:41.508510 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:35:41.508478 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ecf3e0a_e474_4329_bcb4_f372306d8c76.slice/crio-74ea7afe29e5ebaf7fb9983480b13de7502b36f3711e4acdba20f635ceb3f100 WatchSource:0}: Error finding container 74ea7afe29e5ebaf7fb9983480b13de7502b36f3711e4acdba20f635ceb3f100: Status 404 returned error can't find the container with id 74ea7afe29e5ebaf7fb9983480b13de7502b36f3711e4acdba20f635ceb3f100
Apr 16 23:35:41.909302 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:41.909198 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7ffd5c4797-w8z9h" event={"ID":"7ecf3e0a-e474-4329-bcb4-f372306d8c76","Type":"ContainerStarted","Data":"74ea7afe29e5ebaf7fb9983480b13de7502b36f3711e4acdba20f635ceb3f100"}
Apr 16 23:35:44.929831 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.929791 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b7fb74478-jr42z" event={"ID":"f7144126-e112-4369-a074-6f9171fa05df","Type":"ContainerStarted","Data":"0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309"}
Apr 16 23:35:44.930278 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.929868 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6b7fb74478-jr42z"
Apr 16 23:35:44.931211 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.931182 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66b96f9c8-g6g22" event={"ID":"06671266-c86a-47b0-b30f-08d19e7a9a5d","Type":"ContainerStarted","Data":"64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149"}
Apr 16 23:35:44.931329 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.931304 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:44.932449 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.932430 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7ffd5c4797-w8z9h" event={"ID":"7ecf3e0a-e474-4329-bcb4-f372306d8c76","Type":"ContainerStarted","Data":"96a20f688ae25b59175591a79bc15b989068ca0f17017d4e7a5cc8410417e620"}
Apr 16 23:35:44.932604 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.932589 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:44.958546 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.958503 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6b7fb74478-jr42z" podStartSLOduration=2.179300771 podStartE2EDuration="5.958489723s" podCreationTimestamp="2026-04-16 23:35:39 +0000 UTC" firstStartedPulling="2026-04-16 23:35:40.411176494 +0000 UTC m=+548.013095573" lastFinishedPulling="2026-04-16 23:35:44.190365439 +0000 UTC m=+551.792284525" observedRunningTime="2026-04-16 23:35:44.957374237 +0000 UTC m=+552.559293364" watchObservedRunningTime="2026-04-16 23:35:44.958489723 +0000 UTC m=+552.560408824"
Apr 16 23:35:44.973406 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.973371 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-66b96f9c8-g6g22" podStartSLOduration=2.141314362 podStartE2EDuration="5.973361413s" podCreationTimestamp="2026-04-16 23:35:39 +0000 UTC" firstStartedPulling="2026-04-16 23:35:40.358322264 +0000 UTC m=+547.960241343" lastFinishedPulling="2026-04-16 23:35:44.190369316 +0000 UTC m=+551.792288394" observedRunningTime="2026-04-16 23:35:44.971858514 +0000 UTC m=+552.573777614" watchObservedRunningTime="2026-04-16 23:35:44.973361413 +0000 UTC m=+552.575280513"
Apr 16 23:35:44.995441 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:44.995407 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7ffd5c4797-w8z9h" podStartSLOduration=2.30735527 podStartE2EDuration="4.995397129s" podCreationTimestamp="2026-04-16 23:35:40 +0000 UTC" firstStartedPulling="2026-04-16 23:35:41.511995482 +0000 UTC m=+549.113914567" lastFinishedPulling="2026-04-16 23:35:44.200037332 +0000 UTC m=+551.801956426" observedRunningTime="2026-04-16 23:35:44.99400041 +0000 UTC m=+552.595919513" watchObservedRunningTime="2026-04-16 23:35:44.995397129 +0000 UTC m=+552.597316273"
Apr 16 23:35:50.941679 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:50.941646 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:50.942109 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:50.941707 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7ffd5c4797-w8z9h"
Apr 16 23:35:50.996326 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:50.996293 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-66b96f9c8-g6g22"]
Apr 16 23:35:50.996651 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:50.996598 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-66b96f9c8-g6g22" podUID="06671266-c86a-47b0-b30f-08d19e7a9a5d" containerName="maas-api" containerID="cri-o://64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149" gracePeriod=30
Apr 16 23:35:51.240201 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.240182 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:51.416549 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.416494 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/06671266-c86a-47b0-b30f-08d19e7a9a5d-maas-api-tls\") pod \"06671266-c86a-47b0-b30f-08d19e7a9a5d\" (UID: \"06671266-c86a-47b0-b30f-08d19e7a9a5d\") "
Apr 16 23:35:51.416691 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.416665 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c8d9\" (UniqueName: \"kubernetes.io/projected/06671266-c86a-47b0-b30f-08d19e7a9a5d-kube-api-access-6c8d9\") pod \"06671266-c86a-47b0-b30f-08d19e7a9a5d\" (UID: \"06671266-c86a-47b0-b30f-08d19e7a9a5d\") "
Apr 16 23:35:51.418643 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.418616 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06671266-c86a-47b0-b30f-08d19e7a9a5d-kube-api-access-6c8d9" (OuterVolumeSpecName: "kube-api-access-6c8d9") pod "06671266-c86a-47b0-b30f-08d19e7a9a5d" (UID: "06671266-c86a-47b0-b30f-08d19e7a9a5d"). InnerVolumeSpecName "kube-api-access-6c8d9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 23:35:51.418643 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.418621 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06671266-c86a-47b0-b30f-08d19e7a9a5d-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "06671266-c86a-47b0-b30f-08d19e7a9a5d" (UID: "06671266-c86a-47b0-b30f-08d19e7a9a5d"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 23:35:51.517665 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.517609 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6c8d9\" (UniqueName: \"kubernetes.io/projected/06671266-c86a-47b0-b30f-08d19e7a9a5d-kube-api-access-6c8d9\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\""
Apr 16 23:35:51.517665 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.517631 2573 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/06671266-c86a-47b0-b30f-08d19e7a9a5d-maas-api-tls\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\""
Apr 16 23:35:51.958321 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.958291 2573 generic.go:358] "Generic (PLEG): container finished" podID="06671266-c86a-47b0-b30f-08d19e7a9a5d" containerID="64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149" exitCode=0
Apr 16 23:35:51.958684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.958355 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-66b96f9c8-g6g22"
Apr 16 23:35:51.958684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.958378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66b96f9c8-g6g22" event={"ID":"06671266-c86a-47b0-b30f-08d19e7a9a5d","Type":"ContainerDied","Data":"64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149"}
Apr 16 23:35:51.958684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.958417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-66b96f9c8-g6g22" event={"ID":"06671266-c86a-47b0-b30f-08d19e7a9a5d","Type":"ContainerDied","Data":"a6331bb0cd8d9d399b39b9a8c3ce9b480f56d422dcfaa441736e668b714dd1e7"}
Apr 16 23:35:51.958684 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.958434 2573 scope.go:117] "RemoveContainer" containerID="64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149"
Apr 16 23:35:51.970387 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.970370 2573 scope.go:117] "RemoveContainer" containerID="64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149"
Apr 16 23:35:51.970678 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:35:51.970659 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149\": container with ID starting with 64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149 not found: ID does not exist" containerID="64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149"
Apr 16 23:35:51.970745 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.970688 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149"} err="failed to get container status \"64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149\": rpc error: code = NotFound desc = could not find container \"64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149\": container with ID starting with 64e2c2a9096090aaa858cc225817803e810c578eeebbc8734b2790bcccd5e149 not found: ID does not exist"
Apr 16 23:35:51.980577 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.980553 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-66b96f9c8-g6g22"]
Apr 16 23:35:51.984925 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:51.984903 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-66b96f9c8-g6g22"]
Apr 16 23:35:53.058223 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:53.058192 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06671266-c86a-47b0-b30f-08d19e7a9a5d" path="/var/lib/kubelet/pods/06671266-c86a-47b0-b30f-08d19e7a9a5d/volumes"
Apr 16 23:35:55.941623 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:55.941588 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6b7fb74478-jr42z"
Apr 16 23:35:56.225975 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.225899 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-876898b5-847cb"]
Apr 16 23:35:56.226307 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.226294 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="06671266-c86a-47b0-b30f-08d19e7a9a5d" containerName="maas-api"
Apr 16 23:35:56.226351 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.226309 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="06671266-c86a-47b0-b30f-08d19e7a9a5d" containerName="maas-api"
Apr 16 23:35:56.226383 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.226372 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="06671266-c86a-47b0-b30f-08d19e7a9a5d" containerName="maas-api"
Apr 16 23:35:56.230887 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.230870 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-876898b5-847cb"
Apr 16 23:35:56.234285 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.234258 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-876898b5-847cb"]
Apr 16 23:35:56.251943 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.251910 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkl76\" (UniqueName: \"kubernetes.io/projected/baed2a62-318f-41b4-83b1-6c843b8d9644-kube-api-access-gkl76\") pod \"maas-controller-876898b5-847cb\" (UID: \"baed2a62-318f-41b4-83b1-6c843b8d9644\") " pod="opendatahub/maas-controller-876898b5-847cb"
Apr 16 23:35:56.353078 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.353053 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkl76\" (UniqueName: \"kubernetes.io/projected/baed2a62-318f-41b4-83b1-6c843b8d9644-kube-api-access-gkl76\") pod \"maas-controller-876898b5-847cb\" (UID: \"baed2a62-318f-41b4-83b1-6c843b8d9644\") " pod="opendatahub/maas-controller-876898b5-847cb"
Apr 16 23:35:56.362959 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.362940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkl76\" (UniqueName: \"kubernetes.io/projected/baed2a62-318f-41b4-83b1-6c843b8d9644-kube-api-access-gkl76\") pod \"maas-controller-876898b5-847cb\" (UID: \"baed2a62-318f-41b4-83b1-6c843b8d9644\") " pod="opendatahub/maas-controller-876898b5-847cb"
Apr 16 23:35:56.543348 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.543290 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-876898b5-847cb" Apr 16 23:35:56.664028 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.664006 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-876898b5-847cb"] Apr 16 23:35:56.666314 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:35:56.666287 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaed2a62_318f_41b4_83b1_6c843b8d9644.slice/crio-eb9402407a5213ab6a7c2de96cd71f064a000c5538bf1d01f7972a17107ffb57 WatchSource:0}: Error finding container eb9402407a5213ab6a7c2de96cd71f064a000c5538bf1d01f7972a17107ffb57: Status 404 returned error can't find the container with id eb9402407a5213ab6a7c2de96cd71f064a000c5538bf1d01f7972a17107ffb57 Apr 16 23:35:56.976594 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:56.976562 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-876898b5-847cb" event={"ID":"baed2a62-318f-41b4-83b1-6c843b8d9644","Type":"ContainerStarted","Data":"eb9402407a5213ab6a7c2de96cd71f064a000c5538bf1d01f7972a17107ffb57"} Apr 16 23:35:57.981339 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:57.981307 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-876898b5-847cb" event={"ID":"baed2a62-318f-41b4-83b1-6c843b8d9644","Type":"ContainerStarted","Data":"8130135ca78f7ad5dfb8d8feb8649c165339c2df69d5d0590bf720367cf2fe7b"} Apr 16 23:35:57.981716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:57.981362 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-876898b5-847cb" Apr 16 23:35:57.997072 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:35:57.997027 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-876898b5-847cb" podStartSLOduration=1.6172829800000001 podStartE2EDuration="1.99701469s" 
podCreationTimestamp="2026-04-16 23:35:56 +0000 UTC" firstStartedPulling="2026-04-16 23:35:56.668102351 +0000 UTC m=+564.270021429" lastFinishedPulling="2026-04-16 23:35:57.047834045 +0000 UTC m=+564.649753139" observedRunningTime="2026-04-16 23:35:57.994473289 +0000 UTC m=+565.596392464" watchObservedRunningTime="2026-04-16 23:35:57.99701469 +0000 UTC m=+565.598933791" Apr 16 23:36:08.990578 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:08.990520 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-876898b5-847cb" Apr 16 23:36:09.032306 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:09.032277 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6b7fb74478-jr42z"] Apr 16 23:36:09.032517 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:09.032492 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6b7fb74478-jr42z" podUID="f7144126-e112-4369-a074-6f9171fa05df" containerName="manager" containerID="cri-o://0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309" gracePeriod=10 Apr 16 23:36:09.271280 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:09.271258 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6b7fb74478-jr42z" Apr 16 23:36:09.350824 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:09.350798 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5hjz\" (UniqueName: \"kubernetes.io/projected/f7144126-e112-4369-a074-6f9171fa05df-kube-api-access-c5hjz\") pod \"f7144126-e112-4369-a074-6f9171fa05df\" (UID: \"f7144126-e112-4369-a074-6f9171fa05df\") " Apr 16 23:36:09.352729 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:09.352702 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7144126-e112-4369-a074-6f9171fa05df-kube-api-access-c5hjz" (OuterVolumeSpecName: "kube-api-access-c5hjz") pod "f7144126-e112-4369-a074-6f9171fa05df" (UID: "f7144126-e112-4369-a074-6f9171fa05df"). InnerVolumeSpecName "kube-api-access-c5hjz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:36:09.452004 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:09.451978 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c5hjz\" (UniqueName: \"kubernetes.io/projected/f7144126-e112-4369-a074-6f9171fa05df-kube-api-access-c5hjz\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:36:10.025220 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.025184 2573 generic.go:358] "Generic (PLEG): container finished" podID="f7144126-e112-4369-a074-6f9171fa05df" containerID="0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309" exitCode=0 Apr 16 23:36:10.025594 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.025253 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6b7fb74478-jr42z" Apr 16 23:36:10.025594 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.025267 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b7fb74478-jr42z" event={"ID":"f7144126-e112-4369-a074-6f9171fa05df","Type":"ContainerDied","Data":"0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309"} Apr 16 23:36:10.025594 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.025304 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6b7fb74478-jr42z" event={"ID":"f7144126-e112-4369-a074-6f9171fa05df","Type":"ContainerDied","Data":"4a65e49e734a14cc20ff7247f4dc1751b4507911e341eb5b302fa7210dfe5463"} Apr 16 23:36:10.025594 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.025319 2573 scope.go:117] "RemoveContainer" containerID="0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309" Apr 16 23:36:10.033443 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.033333 2573 scope.go:117] "RemoveContainer" containerID="0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309" Apr 16 23:36:10.033599 ip-10-0-136-153 kubenswrapper[2573]: E0416 23:36:10.033581 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309\": container with ID starting with 0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309 not found: ID does not exist" containerID="0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309" Apr 16 23:36:10.033668 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.033608 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309"} err="failed to get container status \"0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309\": rpc error: 
code = NotFound desc = could not find container \"0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309\": container with ID starting with 0743d54345c1866e655572dcfbda6c982669301ed7e40b15015e111cfa219309 not found: ID does not exist" Apr 16 23:36:10.046032 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.046009 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6b7fb74478-jr42z"] Apr 16 23:36:10.047699 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:10.047682 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6b7fb74478-jr42z"] Apr 16 23:36:11.058768 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:11.058735 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7144126-e112-4369-a074-6f9171fa05df" path="/var/lib/kubelet/pods/f7144126-e112-4369-a074-6f9171fa05df/volumes" Apr 16 23:36:47.077103 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.077025 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx"] Apr 16 23:36:47.077519 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.077403 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f7144126-e112-4369-a074-6f9171fa05df" containerName="manager" Apr 16 23:36:47.077519 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.077414 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7144126-e112-4369-a074-6f9171fa05df" containerName="manager" Apr 16 23:36:47.077519 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.077470 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f7144126-e112-4369-a074-6f9171fa05df" containerName="manager" Apr 16 23:36:47.079459 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.079444 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.081699 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.081672 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-k4454\"" Apr 16 23:36:47.082476 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.082445 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 23:36:47.082476 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.082462 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 23:36:47.082476 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.082445 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 23:36:47.090770 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.090751 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx"] Apr 16 23:36:47.248693 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.248665 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czdg2\" (UniqueName: \"kubernetes.io/projected/f5f75960-784c-4ec0-a1b7-72b83b7f1558-kube-api-access-czdg2\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.248848 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.248703 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f75960-784c-4ec0-a1b7-72b83b7f1558-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" 
(UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.248848 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.248749 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.248848 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.248793 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.248848 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.248828 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.248998 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.248852 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.349716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.349630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.349716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.349692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.349716 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.349712 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.349980 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.349736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.349980 ip-10-0-136-153 kubenswrapper[2573]: 
I0416 23:36:47.349865 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czdg2\" (UniqueName: \"kubernetes.io/projected/f5f75960-784c-4ec0-a1b7-72b83b7f1558-kube-api-access-czdg2\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.349980 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.349900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f75960-784c-4ec0-a1b7-72b83b7f1558-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.350134 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.350073 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.350240 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.350215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.350431 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.350232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.352113 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.352091 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f5f75960-784c-4ec0-a1b7-72b83b7f1558-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.352255 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.352235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f5f75960-784c-4ec0-a1b7-72b83b7f1558-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.357163 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.357144 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czdg2\" (UniqueName: \"kubernetes.io/projected/f5f75960-784c-4ec0-a1b7-72b83b7f1558-kube-api-access-czdg2\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx\" (UID: \"f5f75960-784c-4ec0-a1b7-72b83b7f1558\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.390148 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.390125 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:36:47.518507 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:47.518480 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx"] Apr 16 23:36:47.520847 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:36:47.520819 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5f75960_784c_4ec0_a1b7_72b83b7f1558.slice/crio-e437ce8a6d045fc85965768ec54cb52cc20da94129b019e20b06a9494c161529 WatchSource:0}: Error finding container e437ce8a6d045fc85965768ec54cb52cc20da94129b019e20b06a9494c161529: Status 404 returned error can't find the container with id e437ce8a6d045fc85965768ec54cb52cc20da94129b019e20b06a9494c161529 Apr 16 23:36:48.164712 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:48.164667 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" event={"ID":"f5f75960-784c-4ec0-a1b7-72b83b7f1558","Type":"ContainerStarted","Data":"e437ce8a6d045fc85965768ec54cb52cc20da94129b019e20b06a9494c161529"} Apr 16 23:36:53.187129 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:53.187089 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" event={"ID":"f5f75960-784c-4ec0-a1b7-72b83b7f1558","Type":"ContainerStarted","Data":"e49e42e09e38c6fd3f4173cadee61d126b1088702a7a8f7037f32458b6bd7bc3"} Apr 16 23:36:58.206225 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:58.206188 2573 generic.go:358] "Generic (PLEG): container finished" podID="f5f75960-784c-4ec0-a1b7-72b83b7f1558" containerID="e49e42e09e38c6fd3f4173cadee61d126b1088702a7a8f7037f32458b6bd7bc3" exitCode=0 Apr 16 23:36:58.206614 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:36:58.206263 2573 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" event={"ID":"f5f75960-784c-4ec0-a1b7-72b83b7f1558","Type":"ContainerDied","Data":"e49e42e09e38c6fd3f4173cadee61d126b1088702a7a8f7037f32458b6bd7bc3"} Apr 16 23:37:03.231209 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:03.231171 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" event={"ID":"f5f75960-784c-4ec0-a1b7-72b83b7f1558","Type":"ContainerStarted","Data":"e0da3e239c1871bcbf0b58c6c9e61b20a95e5498730273a1592ad7b2c8bad3f0"} Apr 16 23:37:03.231632 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:03.231386 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:37:03.247668 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:03.247624 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" podStartSLOduration=1.034351067 podStartE2EDuration="16.247611158s" podCreationTimestamp="2026-04-16 23:36:47 +0000 UTC" firstStartedPulling="2026-04-16 23:36:47.522513444 +0000 UTC m=+615.124432523" lastFinishedPulling="2026-04-16 23:37:02.735773535 +0000 UTC m=+630.337692614" observedRunningTime="2026-04-16 23:37:03.24701096 +0000 UTC m=+630.848930060" watchObservedRunningTime="2026-04-16 23:37:03.247611158 +0000 UTC m=+630.849530260" Apr 16 23:37:14.248350 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.248313 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx" Apr 16 23:37:14.678116 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.678083 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7"] Apr 16 23:37:14.688295 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:37:14.688272 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.690397 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.690373 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 16 23:37:14.691596 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.691568 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7"] Apr 16 23:37:14.788918 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.788882 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.788918 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.788915 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.789134 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.789036 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.789134 
ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.789069 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.789134 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.789087 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sxx7\" (UniqueName: \"kubernetes.io/projected/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-kube-api-access-6sxx7\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.789134 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.789122 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.889514 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.889482 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.889676 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.889575 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.889676 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.889601 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.889676 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.889619 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sxx7\" (UniqueName: \"kubernetes.io/projected/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-kube-api-access-6sxx7\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.889912 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.889886 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.889988 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.889970 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" 
Apr 16 23:37:14.890046 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.889992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.890247 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.890224 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.890331 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.890293 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.892305 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.892282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.892388 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.892282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.899198 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.899180 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sxx7\" (UniqueName: \"kubernetes.io/projected/d63fb101-969e-4d7e-b1f6-6ec73abaae7f-kube-api-access-6sxx7\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7\" (UID: \"d63fb101-969e-4d7e-b1f6-6ec73abaae7f\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:14.999758 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:14.999711 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:15.127740 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:15.127706 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7"] Apr 16 23:37:15.129962 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:37:15.129920 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd63fb101_969e_4d7e_b1f6_6ec73abaae7f.slice/crio-5225769db8fb898b554b42f9ecee42db160e1d14e4027ff370acab9c40b25909 WatchSource:0}: Error finding container 5225769db8fb898b554b42f9ecee42db160e1d14e4027ff370acab9c40b25909: Status 404 returned error can't find the container with id 5225769db8fb898b554b42f9ecee42db160e1d14e4027ff370acab9c40b25909 Apr 16 23:37:15.131783 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:15.131766 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 23:37:15.274687 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:15.274558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" event={"ID":"d63fb101-969e-4d7e-b1f6-6ec73abaae7f","Type":"ContainerStarted","Data":"7641a6dbdf3398cff1ca448921ae85de6ce6e845687adf2d114e2207e931ec46"} Apr 16 23:37:15.274687 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:15.274605 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" event={"ID":"d63fb101-969e-4d7e-b1f6-6ec73abaae7f","Type":"ContainerStarted","Data":"5225769db8fb898b554b42f9ecee42db160e1d14e4027ff370acab9c40b25909"} Apr 16 23:37:21.303789 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:21.303751 2573 generic.go:358] "Generic (PLEG): container finished" podID="d63fb101-969e-4d7e-b1f6-6ec73abaae7f" containerID="7641a6dbdf3398cff1ca448921ae85de6ce6e845687adf2d114e2207e931ec46" exitCode=0 Apr 16 23:37:21.304316 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:21.303827 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" event={"ID":"d63fb101-969e-4d7e-b1f6-6ec73abaae7f","Type":"ContainerDied","Data":"7641a6dbdf3398cff1ca448921ae85de6ce6e845687adf2d114e2207e931ec46"} Apr 16 23:37:22.309068 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:22.309034 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" event={"ID":"d63fb101-969e-4d7e-b1f6-6ec73abaae7f","Type":"ContainerStarted","Data":"99a2cfa76bf38c383667b09a16d8eac80d0cbd4053d0935c11f4de3c393a9c2f"} Apr 16 23:37:22.309437 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:22.309245 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:22.325756 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:22.325712 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" 
podStartSLOduration=8.163191438 podStartE2EDuration="8.325700992s" podCreationTimestamp="2026-04-16 23:37:14 +0000 UTC" firstStartedPulling="2026-04-16 23:37:21.304563899 +0000 UTC m=+648.906482977" lastFinishedPulling="2026-04-16 23:37:21.467073445 +0000 UTC m=+649.068992531" observedRunningTime="2026-04-16 23:37:22.324639951 +0000 UTC m=+649.926559053" watchObservedRunningTime="2026-04-16 23:37:22.325700992 +0000 UTC m=+649.927620092" Apr 16 23:37:33.325634 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:33.325608 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7" Apr 16 23:37:55.075797 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.075762 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm"] Apr 16 23:37:55.079003 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.078985 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.081174 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.081152 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 16 23:37:55.087434 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.087412 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm"] Apr 16 23:37:55.222342 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.222318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.222560 ip-10-0-136-153 kubenswrapper[2573]: I0416 
23:37:55.222507 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.222648 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.222621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt94s\" (UniqueName: \"kubernetes.io/projected/c4217387-0009-4cb6-89ac-270751df5add-kube-api-access-bt94s\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.222715 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.222687 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.222815 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.222795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4217387-0009-4cb6-89ac-270751df5add-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.223051 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.223028 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.324131 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.324213 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bt94s\" (UniqueName: \"kubernetes.io/projected/c4217387-0009-4cb6-89ac-270751df5add-kube-api-access-bt94s\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.324213 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.324295 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4217387-0009-4cb6-89ac-270751df5add-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 
16 23:37:55.324295 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324274 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.324380 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.324609 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.324697 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324633 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.324697 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.324667 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-home\") pod 
\"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.326520 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.326473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c4217387-0009-4cb6-89ac-270751df5add-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.326774 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.326754 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c4217387-0009-4cb6-89ac-270751df5add-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.331490 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.331468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt94s\" (UniqueName: \"kubernetes.io/projected/c4217387-0009-4cb6-89ac-270751df5add-kube-api-access-bt94s\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-swhhm\" (UID: \"c4217387-0009-4cb6-89ac-270751df5add\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.390806 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.390769 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:37:55.509442 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:55.509294 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm"] Apr 16 23:37:55.512268 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:37:55.512243 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4217387_0009_4cb6_89ac_270751df5add.slice/crio-005d7a0d0249b7c6c03da84d9c0e00d189ca3bb0c244f67c754fa044c7ec3a12 WatchSource:0}: Error finding container 005d7a0d0249b7c6c03da84d9c0e00d189ca3bb0c244f67c754fa044c7ec3a12: Status 404 returned error can't find the container with id 005d7a0d0249b7c6c03da84d9c0e00d189ca3bb0c244f67c754fa044c7ec3a12 Apr 16 23:37:56.426024 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:56.425989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" event={"ID":"c4217387-0009-4cb6-89ac-270751df5add","Type":"ContainerStarted","Data":"3b7ec7d7c52f8fa984579c6e3575dcfaa319e90f32a0f61ee958748a1f88abfd"} Apr 16 23:37:56.426024 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:37:56.426022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" event={"ID":"c4217387-0009-4cb6-89ac-270751df5add","Type":"ContainerStarted","Data":"005d7a0d0249b7c6c03da84d9c0e00d189ca3bb0c244f67c754fa044c7ec3a12"} Apr 16 23:38:01.446224 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:38:01.446187 2573 generic.go:358] "Generic (PLEG): container finished" podID="c4217387-0009-4cb6-89ac-270751df5add" containerID="3b7ec7d7c52f8fa984579c6e3575dcfaa319e90f32a0f61ee958748a1f88abfd" exitCode=0 Apr 16 23:38:01.446687 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:38:01.446257 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" 
event={"ID":"c4217387-0009-4cb6-89ac-270751df5add","Type":"ContainerDied","Data":"3b7ec7d7c52f8fa984579c6e3575dcfaa319e90f32a0f61ee958748a1f88abfd"} Apr 16 23:38:02.451202 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:38:02.451172 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" event={"ID":"c4217387-0009-4cb6-89ac-270751df5add","Type":"ContainerStarted","Data":"e92ea562550ddb1b1ab700be281dbb092b34ebb10d74364ce24126330cf0150a"} Apr 16 23:38:02.451574 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:38:02.451387 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:38:02.467683 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:38:02.467637 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" podStartSLOduration=7.269823734 podStartE2EDuration="7.467622567s" podCreationTimestamp="2026-04-16 23:37:55 +0000 UTC" firstStartedPulling="2026-04-16 23:38:01.4468437 +0000 UTC m=+689.048762779" lastFinishedPulling="2026-04-16 23:38:01.644642534 +0000 UTC m=+689.246561612" observedRunningTime="2026-04-16 23:38:02.46680277 +0000 UTC m=+690.068721875" watchObservedRunningTime="2026-04-16 23:38:02.467622567 +0000 UTC m=+690.069541674" Apr 16 23:38:13.468266 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:38:13.468190 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-swhhm" Apr 16 23:39:51.628127 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:51.628093 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-876898b5-847cb"] Apr 16 23:39:51.628694 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:51.628382 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-876898b5-847cb" 
podUID="baed2a62-318f-41b4-83b1-6c843b8d9644" containerName="manager" containerID="cri-o://8130135ca78f7ad5dfb8d8feb8649c165339c2df69d5d0590bf720367cf2fe7b" gracePeriod=10 Apr 16 23:39:51.849496 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:51.849457 2573 generic.go:358] "Generic (PLEG): container finished" podID="baed2a62-318f-41b4-83b1-6c843b8d9644" containerID="8130135ca78f7ad5dfb8d8feb8649c165339c2df69d5d0590bf720367cf2fe7b" exitCode=0 Apr 16 23:39:51.849652 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:51.849498 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-876898b5-847cb" event={"ID":"baed2a62-318f-41b4-83b1-6c843b8d9644","Type":"ContainerDied","Data":"8130135ca78f7ad5dfb8d8feb8649c165339c2df69d5d0590bf720367cf2fe7b"} Apr 16 23:39:51.868929 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:51.868907 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-876898b5-847cb" Apr 16 23:39:51.994689 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:51.994654 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkl76\" (UniqueName: \"kubernetes.io/projected/baed2a62-318f-41b4-83b1-6c843b8d9644-kube-api-access-gkl76\") pod \"baed2a62-318f-41b4-83b1-6c843b8d9644\" (UID: \"baed2a62-318f-41b4-83b1-6c843b8d9644\") " Apr 16 23:39:51.996724 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:51.996701 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baed2a62-318f-41b4-83b1-6c843b8d9644-kube-api-access-gkl76" (OuterVolumeSpecName: "kube-api-access-gkl76") pod "baed2a62-318f-41b4-83b1-6c843b8d9644" (UID: "baed2a62-318f-41b4-83b1-6c843b8d9644"). InnerVolumeSpecName "kube-api-access-gkl76". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 23:39:52.095670 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:52.095648 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gkl76\" (UniqueName: \"kubernetes.io/projected/baed2a62-318f-41b4-83b1-6c843b8d9644-kube-api-access-gkl76\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 16 23:39:52.854217 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:52.854188 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-876898b5-847cb" Apr 16 23:39:52.854217 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:52.854204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-876898b5-847cb" event={"ID":"baed2a62-318f-41b4-83b1-6c843b8d9644","Type":"ContainerDied","Data":"eb9402407a5213ab6a7c2de96cd71f064a000c5538bf1d01f7972a17107ffb57"} Apr 16 23:39:52.854695 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:52.854247 2573 scope.go:117] "RemoveContainer" containerID="8130135ca78f7ad5dfb8d8feb8649c165339c2df69d5d0590bf720367cf2fe7b" Apr 16 23:39:52.876200 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:52.876175 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-876898b5-847cb"] Apr 16 23:39:52.880039 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:52.880018 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-876898b5-847cb"] Apr 16 23:39:53.063720 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.063685 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baed2a62-318f-41b4-83b1-6c843b8d9644" path="/var/lib/kubelet/pods/baed2a62-318f-41b4-83b1-6c843b8d9644/volumes" Apr 16 23:39:53.579004 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.578971 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-876898b5-5nsbc"] Apr 16 23:39:53.579385 ip-10-0-136-153 
kubenswrapper[2573]: I0416 23:39:53.579373 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="baed2a62-318f-41b4-83b1-6c843b8d9644" containerName="manager" Apr 16 23:39:53.579385 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.579386 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="baed2a62-318f-41b4-83b1-6c843b8d9644" containerName="manager" Apr 16 23:39:53.579484 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.579475 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="baed2a62-318f-41b4-83b1-6c843b8d9644" containerName="manager" Apr 16 23:39:53.583792 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.583772 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-876898b5-5nsbc" Apr 16 23:39:53.585804 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.585781 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-wf8l7\"" Apr 16 23:39:53.590060 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.590034 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-876898b5-5nsbc"] Apr 16 23:39:53.709158 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.709129 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncmzc\" (UniqueName: \"kubernetes.io/projected/b03b846e-a94c-4070-b954-2da4be1af76f-kube-api-access-ncmzc\") pod \"maas-controller-876898b5-5nsbc\" (UID: \"b03b846e-a94c-4070-b954-2da4be1af76f\") " pod="opendatahub/maas-controller-876898b5-5nsbc" Apr 16 23:39:53.810043 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.810013 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncmzc\" (UniqueName: \"kubernetes.io/projected/b03b846e-a94c-4070-b954-2da4be1af76f-kube-api-access-ncmzc\") pod \"maas-controller-876898b5-5nsbc\" (UID: 
\"b03b846e-a94c-4070-b954-2da4be1af76f\") " pod="opendatahub/maas-controller-876898b5-5nsbc" Apr 16 23:39:53.817291 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.817270 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncmzc\" (UniqueName: \"kubernetes.io/projected/b03b846e-a94c-4070-b954-2da4be1af76f-kube-api-access-ncmzc\") pod \"maas-controller-876898b5-5nsbc\" (UID: \"b03b846e-a94c-4070-b954-2da4be1af76f\") " pod="opendatahub/maas-controller-876898b5-5nsbc" Apr 16 23:39:53.896857 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:53.896782 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-876898b5-5nsbc" Apr 16 23:39:54.018413 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:54.018391 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-876898b5-5nsbc"] Apr 16 23:39:54.021074 ip-10-0-136-153 kubenswrapper[2573]: W0416 23:39:54.021051 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb03b846e_a94c_4070_b954_2da4be1af76f.slice/crio-603df9cb3f5fdcd256f9db6cbe5b41aa4c059c6630a6e10e5a55314a9ccef9b5 WatchSource:0}: Error finding container 603df9cb3f5fdcd256f9db6cbe5b41aa4c059c6630a6e10e5a55314a9ccef9b5: Status 404 returned error can't find the container with id 603df9cb3f5fdcd256f9db6cbe5b41aa4c059c6630a6e10e5a55314a9ccef9b5 Apr 16 23:39:54.864908 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:54.864831 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-876898b5-5nsbc" event={"ID":"b03b846e-a94c-4070-b954-2da4be1af76f","Type":"ContainerStarted","Data":"cc154f962947ee43ee73e487a0ad872b4a1f6425a73b45d1504e632c74efbb88"} Apr 16 23:39:54.864908 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:54.864865 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-876898b5-5nsbc" 
event={"ID":"b03b846e-a94c-4070-b954-2da4be1af76f","Type":"ContainerStarted","Data":"603df9cb3f5fdcd256f9db6cbe5b41aa4c059c6630a6e10e5a55314a9ccef9b5"} Apr 16 23:39:54.864908 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:54.864891 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-876898b5-5nsbc" Apr 16 23:39:54.881111 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:39:54.881067 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-876898b5-5nsbc" podStartSLOduration=1.296421024 podStartE2EDuration="1.881052815s" podCreationTimestamp="2026-04-16 23:39:53 +0000 UTC" firstStartedPulling="2026-04-16 23:39:54.022373425 +0000 UTC m=+801.624292505" lastFinishedPulling="2026-04-16 23:39:54.607005217 +0000 UTC m=+802.208924296" observedRunningTime="2026-04-16 23:39:54.878348758 +0000 UTC m=+802.480267856" watchObservedRunningTime="2026-04-16 23:39:54.881052815 +0000 UTC m=+802.482971915" Apr 16 23:40:05.873696 ip-10-0-136-153 kubenswrapper[2573]: I0416 23:40:05.873664 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-876898b5-5nsbc" Apr 17 00:00:00.229177 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.229146 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29606400-cvrh8"] Apr 17 00:00:00.232573 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.232556 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:00.234796 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.234776 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"serviceca\"" Apr 17 00:00:00.234914 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.234791 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"pruner-dockercfg-6vd9v\"" Apr 17 00:00:00.238406 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.238379 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29606400-cvrh8"] Apr 17 00:00:00.318754 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.318730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db2d0971-2c87-4895-8796-19e99f1e0b6d-serviceca\") pod \"image-pruner-29606400-cvrh8\" (UID: \"db2d0971-2c87-4895-8796-19e99f1e0b6d\") " pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:00.318861 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.318759 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxx7f\" (UniqueName: \"kubernetes.io/projected/db2d0971-2c87-4895-8796-19e99f1e0b6d-kube-api-access-gxx7f\") pod \"image-pruner-29606400-cvrh8\" (UID: \"db2d0971-2c87-4895-8796-19e99f1e0b6d\") " pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:00.420243 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.420221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db2d0971-2c87-4895-8796-19e99f1e0b6d-serviceca\") pod \"image-pruner-29606400-cvrh8\" (UID: \"db2d0971-2c87-4895-8796-19e99f1e0b6d\") " pod="openshift-image-registry/image-pruner-29606400-cvrh8" 
Apr 17 00:00:00.420368 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.420257 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxx7f\" (UniqueName: \"kubernetes.io/projected/db2d0971-2c87-4895-8796-19e99f1e0b6d-kube-api-access-gxx7f\") pod \"image-pruner-29606400-cvrh8\" (UID: \"db2d0971-2c87-4895-8796-19e99f1e0b6d\") " pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:00.420870 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.420854 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db2d0971-2c87-4895-8796-19e99f1e0b6d-serviceca\") pod \"image-pruner-29606400-cvrh8\" (UID: \"db2d0971-2c87-4895-8796-19e99f1e0b6d\") " pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:00.428284 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.428262 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxx7f\" (UniqueName: \"kubernetes.io/projected/db2d0971-2c87-4895-8796-19e99f1e0b6d-kube-api-access-gxx7f\") pod \"image-pruner-29606400-cvrh8\" (UID: \"db2d0971-2c87-4895-8796-19e99f1e0b6d\") " pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:00.561807 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.561748 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:00.690556 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.690517 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29606400-cvrh8"] Apr 17 00:00:00.693136 ip-10-0-136-153 kubenswrapper[2573]: W0417 00:00:00.693103 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb2d0971_2c87_4895_8796_19e99f1e0b6d.slice/crio-411a36e73220911bec05fd350a2fc23088d85af184fc61abd57a366c2630fa7e WatchSource:0}: Error finding container 411a36e73220911bec05fd350a2fc23088d85af184fc61abd57a366c2630fa7e: Status 404 returned error can't find the container with id 411a36e73220911bec05fd350a2fc23088d85af184fc61abd57a366c2630fa7e Apr 17 00:00:00.695055 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:00.695040 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 00:00:01.136690 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:01.136660 2573 generic.go:358] "Generic (PLEG): container finished" podID="db2d0971-2c87-4895-8796-19e99f1e0b6d" containerID="bb43781e96f6cc2539bbe487f3975437fa05e4ecdb86a820acf654ab850892b2" exitCode=0 Apr 17 00:00:01.136830 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:01.136751 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29606400-cvrh8" event={"ID":"db2d0971-2c87-4895-8796-19e99f1e0b6d","Type":"ContainerDied","Data":"bb43781e96f6cc2539bbe487f3975437fa05e4ecdb86a820acf654ab850892b2"} Apr 17 00:00:01.136830 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:01.136786 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29606400-cvrh8" event={"ID":"db2d0971-2c87-4895-8796-19e99f1e0b6d","Type":"ContainerStarted","Data":"411a36e73220911bec05fd350a2fc23088d85af184fc61abd57a366c2630fa7e"} Apr 17 
00:00:02.265224 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:02.265200 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:02.338693 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:02.338667 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db2d0971-2c87-4895-8796-19e99f1e0b6d-serviceca\") pod \"db2d0971-2c87-4895-8796-19e99f1e0b6d\" (UID: \"db2d0971-2c87-4895-8796-19e99f1e0b6d\") " Apr 17 00:00:02.338820 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:02.338755 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxx7f\" (UniqueName: \"kubernetes.io/projected/db2d0971-2c87-4895-8796-19e99f1e0b6d-kube-api-access-gxx7f\") pod \"db2d0971-2c87-4895-8796-19e99f1e0b6d\" (UID: \"db2d0971-2c87-4895-8796-19e99f1e0b6d\") " Apr 17 00:00:02.339022 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:02.338998 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db2d0971-2c87-4895-8796-19e99f1e0b6d-serviceca" (OuterVolumeSpecName: "serviceca") pod "db2d0971-2c87-4895-8796-19e99f1e0b6d" (UID: "db2d0971-2c87-4895-8796-19e99f1e0b6d"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 00:00:02.339105 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:02.339062 2573 reconciler_common.go:299] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db2d0971-2c87-4895-8796-19e99f1e0b6d-serviceca\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 17 00:00:02.340869 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:02.340839 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2d0971-2c87-4895-8796-19e99f1e0b6d-kube-api-access-gxx7f" (OuterVolumeSpecName: "kube-api-access-gxx7f") pod "db2d0971-2c87-4895-8796-19e99f1e0b6d" (UID: "db2d0971-2c87-4895-8796-19e99f1e0b6d"). InnerVolumeSpecName "kube-api-access-gxx7f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 00:00:02.439515 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:02.439489 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxx7f\" (UniqueName: \"kubernetes.io/projected/db2d0971-2c87-4895-8796-19e99f1e0b6d-kube-api-access-gxx7f\") on node \"ip-10-0-136-153.ec2.internal\" DevicePath \"\"" Apr 17 00:00:03.144103 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:03.144068 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29606400-cvrh8" event={"ID":"db2d0971-2c87-4895-8796-19e99f1e0b6d","Type":"ContainerDied","Data":"411a36e73220911bec05fd350a2fc23088d85af184fc61abd57a366c2630fa7e"} Apr 17 00:00:03.144229 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:03.144109 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="411a36e73220911bec05fd350a2fc23088d85af184fc61abd57a366c2630fa7e" Apr 17 00:00:03.144229 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:03.144086 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29606400-cvrh8" Apr 17 00:00:47.493813 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:47.493734 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7ffd5c4797-w8z9h_7ecf3e0a-e474-4329-bcb4-f372306d8c76/maas-api/0.log" Apr 17 00:00:47.709289 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:47.709260 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-876898b5-5nsbc_b03b846e-a94c-4070-b954-2da4be1af76f/manager/0.log" Apr 17 00:00:48.150113 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:48.150081 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8bf69b96d-w9qtc_c2bb5cfe-3a91-4560-b161-1a47586b8cae/manager/0.log" Apr 17 00:00:48.254963 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:48.254938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-p6rfz_18144aff-a024-41c0-9d7e-5f920d8c7582/postgres/0.log" Apr 17 00:00:49.681997 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:49.681969 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-n2qcm_233c7354-6628-431b-93c7-bbe1ff4897e2/manager/0.log" Apr 17 00:00:49.895944 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:49.895914 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-dg642_e089524b-e02c-46a3-adc8-2e96780cb78c/registry-server/0.log" Apr 17 00:00:50.214865 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:50.214834 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-kg69x_ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259/manager/0.log" Apr 17 00:00:50.546062 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:50.545998 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f2sllk_cbd01c30-104e-4334-bb60-ecf20077118e/istio-proxy/0.log" Apr 17 00:00:50.972308 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:50.972282 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-wcqv2_dfe10a9a-9555-4894-830a-e5027f15736c/istio-proxy/0.log" Apr 17 00:00:51.079402 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:51.079377 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5897bbd496-f9ftq_99c15427-bc7a-4ab4-b1c3-c425e97bf63c/router/0.log" Apr 17 00:00:51.550810 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:51.550765 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-swhhm_c4217387-0009-4cb6-89ac-270751df5add/storage-initializer/0.log" Apr 17 00:00:51.557346 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:51.557327 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-swhhm_c4217387-0009-4cb6-89ac-270751df5add/main/0.log" Apr 17 00:00:51.769025 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:51.768991 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx_f5f75960-784c-4ec0-a1b7-72b83b7f1558/storage-initializer/0.log" Apr 17 00:00:51.776442 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:51.776422 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccbhrrx_f5f75960-784c-4ec0-a1b7-72b83b7f1558/main/0.log" Apr 17 00:00:51.878807 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:51.878751 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7_d63fb101-969e-4d7e-b1f6-6ec73abaae7f/storage-initializer/0.log" Apr 
17 00:00:51.885439 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:51.885420 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-tj4b7_d63fb101-969e-4d7e-b1f6-6ec73abaae7f/main/0.log" Apr 17 00:00:59.369578 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:59.369549 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bfwst_18bc468b-4161-4117-b3e2-0607b32b04f4/global-pull-secret-syncer/0.log" Apr 17 00:00:59.548182 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:59.548154 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-4np6h_a8419133-562e-45db-abac-da560b01e6d9/konnectivity-agent/0.log" Apr 17 00:00:59.727147 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:00:59.727126 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-153.ec2.internal_be4647434e0fd79acaa8b46b7c163d65/haproxy/0.log" Apr 17 00:01:04.333257 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:04.333208 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-n2qcm_233c7354-6628-431b-93c7-bbe1ff4897e2/manager/0.log" Apr 17 00:01:04.391124 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:04.391040 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-dg642_e089524b-e02c-46a3-adc8-2e96780cb78c/registry-server/0.log" Apr 17 00:01:04.531790 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:04.531764 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-kg69x_ae2c3a5c-ba2b-4ae2-bd5e-71ab11f49259/manager/0.log" Apr 17 00:01:05.826340 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:05.826314 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ba9b01ef-0fef-42ae-8f54-792e6c3257fe/alertmanager/0.log" Apr 17 00:01:05.848726 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:05.848692 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ba9b01ef-0fef-42ae-8f54-792e6c3257fe/config-reloader/0.log" Apr 17 00:01:05.870650 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:05.870630 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ba9b01ef-0fef-42ae-8f54-792e6c3257fe/kube-rbac-proxy-web/0.log" Apr 17 00:01:05.891034 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:05.891014 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ba9b01ef-0fef-42ae-8f54-792e6c3257fe/kube-rbac-proxy/0.log" Apr 17 00:01:05.911967 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:05.911943 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ba9b01ef-0fef-42ae-8f54-792e6c3257fe/kube-rbac-proxy-metric/0.log" Apr 17 00:01:05.933662 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:05.933642 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ba9b01ef-0fef-42ae-8f54-792e6c3257fe/prom-label-proxy/0.log" Apr 17 00:01:05.955053 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:05.955036 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_ba9b01ef-0fef-42ae-8f54-792e6c3257fe/init-config-reloader/0.log" Apr 17 00:01:05.989458 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:05.989436 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-w9vw5_84d158a6-dfbe-407b-be72-b82ba7380fd8/cluster-monitoring-operator/0.log" Apr 17 00:01:06.104797 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.104744 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-krbwl_5f9f37f0-a72b-46b9-8ebb-6751c4d3d1da/monitoring-plugin/0.log" Apr 17 00:01:06.201385 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.201360 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7h258_4a216be1-f2fc-4496-b025-c083bf935ba0/node-exporter/0.log" Apr 17 00:01:06.220900 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.220883 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7h258_4a216be1-f2fc-4496-b025-c083bf935ba0/kube-rbac-proxy/0.log" Apr 17 00:01:06.253816 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.253796 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7h258_4a216be1-f2fc-4496-b025-c083bf935ba0/init-textfile/0.log" Apr 17 00:01:06.429820 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.429799 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9bd0ade-c387-412e-8587-c2fa7e09c914/prometheus/0.log" Apr 17 00:01:06.452452 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.452434 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9bd0ade-c387-412e-8587-c2fa7e09c914/config-reloader/0.log" Apr 17 00:01:06.472010 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.471981 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9bd0ade-c387-412e-8587-c2fa7e09c914/thanos-sidecar/0.log" Apr 17 00:01:06.491544 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.491522 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9bd0ade-c387-412e-8587-c2fa7e09c914/kube-rbac-proxy-web/0.log" Apr 17 00:01:06.511416 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.511399 2573 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9bd0ade-c387-412e-8587-c2fa7e09c914/kube-rbac-proxy/0.log" Apr 17 00:01:06.533306 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.533286 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9bd0ade-c387-412e-8587-c2fa7e09c914/kube-rbac-proxy-thanos/0.log" Apr 17 00:01:06.559584 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.559553 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b9bd0ade-c387-412e-8587-c2fa7e09c914/init-config-reloader/0.log" Apr 17 00:01:06.628223 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.628201 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-s8vl8_dcf7a7a8-e110-4fa0-a351-479b0d42756c/prometheus-operator-admission-webhook/0.log" Apr 17 00:01:06.724950 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.724900 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc7b44fd6-kfxd7_1511bfed-c802-44ba-95c8-e41c1ab89d60/thanos-query/0.log" Apr 17 00:01:06.744799 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.744780 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc7b44fd6-kfxd7_1511bfed-c802-44ba-95c8-e41c1ab89d60/kube-rbac-proxy-web/0.log" Apr 17 00:01:06.767973 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.767955 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc7b44fd6-kfxd7_1511bfed-c802-44ba-95c8-e41c1ab89d60/kube-rbac-proxy/0.log" Apr 17 00:01:06.786434 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.786414 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc7b44fd6-kfxd7_1511bfed-c802-44ba-95c8-e41c1ab89d60/prom-label-proxy/0.log" Apr 17 00:01:06.806148 
ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.806128 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc7b44fd6-kfxd7_1511bfed-c802-44ba-95c8-e41c1ab89d60/kube-rbac-proxy-rules/0.log" Apr 17 00:01:06.825361 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:06.825341 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5fc7b44fd6-kfxd7_1511bfed-c802-44ba-95c8-e41c1ab89d60/kube-rbac-proxy-metrics/0.log" Apr 17 00:01:08.894561 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.894499 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7"] Apr 17 00:01:08.895113 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.895093 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db2d0971-2c87-4895-8796-19e99f1e0b6d" containerName="image-pruner" Apr 17 00:01:08.895203 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.895117 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2d0971-2c87-4895-8796-19e99f1e0b6d" containerName="image-pruner" Apr 17 00:01:08.895257 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.895221 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="db2d0971-2c87-4895-8796-19e99f1e0b6d" containerName="image-pruner" Apr 17 00:01:08.898581 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.898561 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:08.900791 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.900766 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zs59b\"/\"kube-root-ca.crt\"" Apr 17 00:01:08.900916 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.900766 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-zs59b\"/\"default-dockercfg-rtvn7\"" Apr 17 00:01:08.901411 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.901388 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-zs59b\"/\"openshift-service-ca.crt\"" Apr 17 00:01:08.906511 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.906260 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7"] Apr 17 00:01:08.981203 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.981163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-sys\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:08.981309 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.981217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-podres\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:08.981309 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.981285 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-t9c6l\" (UniqueName: \"kubernetes.io/projected/4138eedb-d995-4fb4-8f08-fb2203ce4522-kube-api-access-t9c6l\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:08.981398 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.981321 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-proc\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:08.981398 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:08.981360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-lib-modules\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082015 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.081992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-proc\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082111 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.082026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-lib-modules\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " 
pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082111 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.082079 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-sys\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082111 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.082101 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-podres\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082111 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.082101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-proc\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082314 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.082116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t9c6l\" (UniqueName: \"kubernetes.io/projected/4138eedb-d995-4fb4-8f08-fb2203ce4522-kube-api-access-t9c6l\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082314 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.082145 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-sys\") 
pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082314 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.082222 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-lib-modules\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.082314 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.082243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4138eedb-d995-4fb4-8f08-fb2203ce4522-podres\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.089328 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.089302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9c6l\" (UniqueName: \"kubernetes.io/projected/4138eedb-d995-4fb4-8f08-fb2203ce4522-kube-api-access-t9c6l\") pod \"perf-node-gather-daemonset-v8vk7\" (UID: \"4138eedb-d995-4fb4-8f08-fb2203ce4522\") " pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.209219 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.209199 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" Apr 17 00:01:09.312493 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.312471 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-tjplv_c00f22ca-f714-4922-9bff-d7d78acd7194/volume-data-source-validator/0.log" Apr 17 00:01:09.333698 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.333677 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7"] Apr 17 00:01:09.335941 ip-10-0-136-153 kubenswrapper[2573]: W0417 00:01:09.335911 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4138eedb_d995_4fb4_8f08_fb2203ce4522.slice/crio-9bf633b8892aad1f6e0c511397002e57e800e6d05700983058a990a9a3e34a00 WatchSource:0}: Error finding container 9bf633b8892aad1f6e0c511397002e57e800e6d05700983058a990a9a3e34a00: Status 404 returned error can't find the container with id 9bf633b8892aad1f6e0c511397002e57e800e6d05700983058a990a9a3e34a00 Apr 17 00:01:09.381318 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:09.381287 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" event={"ID":"4138eedb-d995-4fb4-8f08-fb2203ce4522","Type":"ContainerStarted","Data":"9bf633b8892aad1f6e0c511397002e57e800e6d05700983058a990a9a3e34a00"} Apr 17 00:01:10.063410 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:10.063380 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2cxtr_fbf7e722-63c3-4ad8-b126-d39966fa38f3/dns/0.log" Apr 17 00:01:10.084066 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:10.084048 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2cxtr_fbf7e722-63c3-4ad8-b126-d39966fa38f3/kube-rbac-proxy/0.log" Apr 17 00:01:10.239430 ip-10-0-136-153 kubenswrapper[2573]: 
I0417 00:01:10.239406 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m8gnt_a40691e1-4691-4b8d-b935-ff781629806d/dns-node-resolver/0.log"
Apr 17 00:01:10.387850 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:10.387785 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" event={"ID":"4138eedb-d995-4fb4-8f08-fb2203ce4522","Type":"ContainerStarted","Data":"02091e094ded106a706887df9923c79c347de0ca4c1c988bac48936b8558f1e1"}
Apr 17 00:01:10.387955 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:10.387937 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7"
Apr 17 00:01:10.401647 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:10.401610 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7" podStartSLOduration=2.4015978110000002 podStartE2EDuration="2.401597811s" podCreationTimestamp="2026-04-17 00:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 00:01:10.400926381 +0000 UTC m=+2078.002845479" watchObservedRunningTime="2026-04-17 00:01:10.401597811 +0000 UTC m=+2078.003516912"
Apr 17 00:01:10.666673 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:10.666599 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-pruner-29606400-cvrh8_db2d0971-2c87-4895-8796-19e99f1e0b6d/image-pruner/0.log"
Apr 17 00:01:10.698342 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:10.698314 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7f59d69d8f-zflqg_b7bb1187-a715-4b7a-aa6e-0dc183dc753d/registry/0.log"
Apr 17 00:01:10.759395 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:10.759372 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-kt5t6_e24fef17-a7c3-497e-a65f-9458686c8ea2/node-ca/0.log"
Apr 17 00:01:11.553766 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:11.553735 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557f2sllk_cbd01c30-104e-4334-bb60-ecf20077118e/istio-proxy/0.log"
Apr 17 00:01:11.745839 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:11.745803 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-wcqv2_dfe10a9a-9555-4894-830a-e5027f15736c/istio-proxy/0.log"
Apr 17 00:01:11.765343 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:11.765321 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5897bbd496-f9ftq_99c15427-bc7a-4ab4-b1c3-c425e97bf63c/router/0.log"
Apr 17 00:01:12.293583 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:12.293551 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fdk7d_9ee4f424-f877-4116-acf8-e4a8c70b5329/serve-healthcheck-canary/0.log"
Apr 17 00:01:12.888898 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:12.888862 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xdd9t_6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b/kube-rbac-proxy/0.log"
Apr 17 00:01:12.908354 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:12.908329 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xdd9t_6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b/exporter/0.log"
Apr 17 00:01:12.929817 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:12.929796 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xdd9t_6e7dbba2-9f6d-49ca-be01-6f1dcc09a72b/extractor/0.log"
Apr 17 00:01:14.825671 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:14.825637 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7ffd5c4797-w8z9h_7ecf3e0a-e474-4329-bcb4-f372306d8c76/maas-api/0.log"
Apr 17 00:01:14.892007 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:14.891981 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-876898b5-5nsbc_b03b846e-a94c-4070-b954-2da4be1af76f/manager/0.log"
Apr 17 00:01:15.000560 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:15.000503 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-8bf69b96d-w9qtc_c2bb5cfe-3a91-4560-b161-1a47586b8cae/manager/0.log"
Apr 17 00:01:15.024066 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:15.024039 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-p6rfz_18144aff-a024-41c0-9d7e-5f920d8c7582/postgres/0.log"
Apr 17 00:01:16.107879 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:16.107834 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-86bf875fd5-mkdfq_e0df32e9-0dba-4bd0-a534-af2d3a867627/manager/0.log"
Apr 17 00:01:16.401928 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:16.401856 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-zs59b/perf-node-gather-daemonset-v8vk7"
Apr 17 00:01:22.224252 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.224226 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz8rl_5cb2ede9-d7bb-4d1f-9aca-83f7715b5495/kube-multus-additional-cni-plugins/0.log"
Apr 17 00:01:22.243709 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.243687 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz8rl_5cb2ede9-d7bb-4d1f-9aca-83f7715b5495/egress-router-binary-copy/0.log"
Apr 17 00:01:22.262932 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.262916 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz8rl_5cb2ede9-d7bb-4d1f-9aca-83f7715b5495/cni-plugins/0.log"
Apr 17 00:01:22.283353 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.283330 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz8rl_5cb2ede9-d7bb-4d1f-9aca-83f7715b5495/bond-cni-plugin/0.log"
Apr 17 00:01:22.302579 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.302564 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz8rl_5cb2ede9-d7bb-4d1f-9aca-83f7715b5495/routeoverride-cni/0.log"
Apr 17 00:01:22.322398 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.322369 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz8rl_5cb2ede9-d7bb-4d1f-9aca-83f7715b5495/whereabouts-cni-bincopy/0.log"
Apr 17 00:01:22.342037 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.342020 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-wz8rl_5cb2ede9-d7bb-4d1f-9aca-83f7715b5495/whereabouts-cni/0.log"
Apr 17 00:01:22.425016 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.424993 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vr4xk_7e54684e-7b38-446d-a750-4bb17e3d69b0/kube-multus/0.log"
Apr 17 00:01:22.485224 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.485172 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qhz5v_3ed66159-86cf-4f43-824b-3905a5019c1c/network-metrics-daemon/0.log"
Apr 17 00:01:22.503138 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:22.503121 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qhz5v_3ed66159-86cf-4f43-824b-3905a5019c1c/kube-rbac-proxy/0.log"
Apr 17 00:01:23.347453 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:23.347429 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pn57_9c864774-d6f1-4d07-8798-126023861e55/ovn-controller/0.log"
Apr 17 00:01:23.378090 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:23.378062 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pn57_9c864774-d6f1-4d07-8798-126023861e55/ovn-acl-logging/0.log"
Apr 17 00:01:23.395378 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:23.395361 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pn57_9c864774-d6f1-4d07-8798-126023861e55/kube-rbac-proxy-node/0.log"
Apr 17 00:01:23.416388 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:23.416358 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pn57_9c864774-d6f1-4d07-8798-126023861e55/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 00:01:23.434628 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:23.434609 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pn57_9c864774-d6f1-4d07-8798-126023861e55/northd/0.log"
Apr 17 00:01:23.454163 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:23.454145 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pn57_9c864774-d6f1-4d07-8798-126023861e55/nbdb/0.log"
Apr 17 00:01:23.473484 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:23.473465 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pn57_9c864774-d6f1-4d07-8798-126023861e55/sbdb/0.log"
Apr 17 00:01:23.563422 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:23.563397 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pn57_9c864774-d6f1-4d07-8798-126023861e55/ovnkube-controller/0.log"
Apr 17 00:01:25.206288 ip-10-0-136-153 kubenswrapper[2573]: I0417 00:01:25.206264 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-svp68_df573a86-aad9-4aaa-9c40-5e9073ed8760/network-check-target-container/0.log"